Machine Learning – Generate adaptive models directly from complex datasets for object classification and predictive analytics, such as identifying which new advertising markets to enter.
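
As a language-agnostic illustration of such a model, here is a minimal pure-Python 1-nearest-neighbour classifier (the market data and labels below are invented for the example):

```python
import math

def nearest_neighbor(train, query):
    """Classify `query` with the label of its closest training point (1-NN)."""
    features, label = min(train, key=lambda pair: math.dist(pair[0], query))
    return label

# Toy data: (ad spend, conversion rate) -> whether entering the market paid off
train = [((1.0, 0.02), "skip"), ((5.0, 0.08), "enter"),
         ((1.5, 0.03), "skip"), ((6.0, 0.10), "enter")]
print(nearest_neighbor(train, (5.2, 0.09)))  # -> enter
```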

Report Generation – Display conclusions and insights in a styled, formatted document for meetings, ongoing projects or public information, like a quarterly earnings report.

Time Series – Model, simulate and forecast sequences of events over time to track long-term trends and make predictions, such as expected sales for the next holiday season.
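
One of the simplest forecasting techniques in this family is exponential smoothing; a minimal sketch with invented sales figures:

```python
def ses_forecast(series, alpha=0.5):
    """Simple exponential smoothing: each level blends the newest observation
    with the previous level; the final level is the one-step-ahead forecast."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
    return level

# Sales for the last four holiday seasons (toy numbers)
sales = [100, 120, 110, 130]
print(ses_forecast(sales, alpha=0.5))  # -> 120.0
```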

Neural Networks – Create and train layered processing networks for deep analysis and processing tasks, such as recognizing defective items coming off a production line.
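
The smallest trainable network is a single neuron; this sketch trains a classic perceptron on the (linearly separable) AND function to show the train-then-predict loop in miniature:

```python
def train_perceptron(data, epochs=20, lr=0.1):
    """Single neuron with a step activation; weights are nudged toward
    each misclassified example (the classic perceptron learning rule)."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, target in data:
            pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = target - pred
            w[0] += lr * err * x[0]
            w[1] += lr * err * x[1]
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

# AND gate: linearly separable, so the perceptron converges
gate = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(gate)
print([predict(w, b, x) for x, _ in gate])  # -> [0, 0, 0, 1]
```

Real defect-detection networks stack many such units into layers, but the weight-update idea is the same.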

Cluster Analysis – Group and analyze data based on similarity measures to extract underlying patterns and relationships, such as which customers are most similar to your top 100.
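
A compact sketch of the best-known similarity-based grouping method, Lloyd's k-means algorithm (the customer data and starting centroids are invented):

```python
import math

def kmeans(points, centroids, iters=10):
    """Lloyd's algorithm: assign each point to its nearest centroid,
    then move each centroid to the mean of its assigned points."""
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for p in points:
            i = min(range(len(centroids)), key=lambda i: math.dist(p, centroids[i]))
            clusters[i].append(p)
        centroids = [
            tuple(sum(c) / len(cl) for c in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters

# Customers described by (annual spend, visits per month); two obvious groups
pts = [(1, 1), (1.5, 2), (1, 2), (8, 8), (9, 9), (8, 9)]
cents, groups = kmeans(pts, centroids=[(0, 0), (10, 10)])
print(cents)
```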

Graph/Network Analysis – Explore and visualize systems of discrete relationships to analyze correlations and patterns, such as modeling demographics in a social network.
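
For instance, breadth-first search finds the "degrees of separation" between two people in a network stored as an adjacency list (the toy social graph below is invented):

```python
from collections import deque

def shortest_path_length(graph, start, goal):
    """Breadth-first search over an adjacency-list graph; returns the
    number of hops between two nodes, or None if they are unreachable."""
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        node, hops = queue.popleft()
        if node == goal:
            return hops
        for nbr in graph.get(node, []):
            if nbr not in seen:
                seen.add(nbr)
                queue.append((nbr, hops + 1))
    return None

social = {
    "ada": ["bo", "cy"],
    "bo": ["ada", "dee"],
    "cy": ["ada", "dee"],
    "dee": ["bo", "cy", "eli"],
    "eli": ["dee"],
}
print(shortest_path_length(social, "ada", "eli"))  # -> 3
```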

Dynamic Visualization – Display data in styled plots, charts and infographics, making it human-readable and interactive for quick analysis and decision making.

Survival Analysis – Compute survival functions and lifetime distributions to analyze time-to-event data, such as the expected lifetime of a piece of industrial equipment.
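
The standard nonparametric tool here is the Kaplan-Meier estimator; a small sketch with invented equipment lifetimes (event = 1 marks a failure, 0 marks a unit still running, i.e. censored):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimator: at each observed failure time t, multiply the
    running survival probability by (1 - d/n), where d failures occur
    among the n units still at risk."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk, surv, curve = len(times), 1.0, []
    i = 0
    while i < len(order):
        t = times[order[i]]
        d = n_t = 0
        while i < len(order) and times[order[i]] == t:
            n_t += 1
            d += events[order[i]]
            i += 1
        if d:
            surv *= 1 - d / at_risk
            curve.append((t, surv))
        at_risk -= n_t
    return curve

lifetimes = [2, 3, 3, 5, 8]   # months in service
observed  = [1, 1, 0, 1, 0]   # 1 = failed, 0 = censored
print(kaplan_meier(lifetimes, observed))
```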

Semantic Text Analysis – Analyze underlying structures in linguistic data to clean up data and extract meaning, such as determining sentiment in customer comments.

Data Semantics – Standardize various incoming datasets into a unified framework for easier analysis, such as consolidating data with different unit systems.

Queueing Theory – Model and simulate systems of queues to analyze waiting times and resource allocation, such as the optimal number of tellers at a bank branch.
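
The textbook starting point is the M/M/1 queue, whose steady-state behaviour has closed-form answers (the teller numbers below are invented):

```python
def mm1_metrics(arrival_rate, service_rate):
    """Steady-state metrics for an M/M/1 queue (Poisson arrivals,
    exponential service times, one server). Requires utilization < 1."""
    rho = arrival_rate / service_rate               # server utilization
    if rho >= 1:
        raise ValueError("queue is unstable: arrivals outpace service")
    return {
        "utilization": rho,
        "avg_in_system": rho / (1 - rho),           # L: mean customers present
        "avg_time_in_system": 1 / (service_rate - arrival_rate),  # W
    }

# One teller serving 12 customers/hour while 10/hour arrive
print(mm1_metrics(arrival_rate=10, service_rate=12))
```

Comparing these metrics across teller counts (i.e. across service capacities) is how the "optimal number of tellers" question gets answered.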

Wavelets – Deconstruct data signals into constituent parts for advanced manipulation and filtering of specific features, such as eliminating background noise from sensor data.

Systems Modeling – Model physical, electrical and other systems to inform design decisions, like the most effective heating installation for a building.

Statistical Distributions – Fit historic data to parametric distributions to make inferences about the underlying events, such as the likelihood of a customer clicking through an ad.
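
A minimal sketch of that workflow using Python's standard library (the click-through rates are invented): the sample mean and population standard deviation are the maximum-likelihood estimates for a normal distribution, and the fitted model then yields probabilities for future events.

```python
from statistics import NormalDist, fmean, pstdev

# Historic click-through rates from past ad campaigns (toy data)
ctrs = [0.021, 0.025, 0.019, 0.023, 0.022, 0.024]

# MLE fit of a normal distribution to the historic data
fitted = NormalDist(mu=fmean(ctrs), sigma=pstdev(ctrs))

# Inference from the fitted model: chance a future campaign beats a 2.4% CTR
print(round(1 - fitted.cdf(0.024), 3))
```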

Random Processes – Model the progression of a system over time to make observations and predictions about its behavior, such as analyzing peak hours at a particular store location.
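
Customer arrivals are classically modelled as a Poisson process, which is simple to simulate because inter-arrival gaps are independent exponential draws (rates and hours below are invented):

```python
import random

def simulate_arrivals(rate_per_hour, hours, rng):
    """Homogeneous Poisson process: successive inter-arrival gaps are
    independent exponential draws with mean 1/rate."""
    t, times = 0.0, []
    while True:
        t += rng.expovariate(rate_per_hour)
        if t > hours:
            return times
        times.append(t)

rng = random.Random(42)  # fixed seed so the simulation is repeatable
arrivals = simulate_arrivals(rate_per_hour=30, hours=8, rng=rng)
print(len(arrivals))  # roughly 30 * 8 = 240 arrivals expected
```

Binning the arrival times by hour (for a time-varying rate) is the natural next step toward a peak-hours analysis.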

Optimization – Apply numeric and symbolic optimization methods to find the values that best satisfy key criteria, such as the ideal allocation of portfolio assets.
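
A classic portfolio example with a closed-form answer is the two-asset minimum-variance split; a small sketch (the variances and covariance are invented):

```python
def min_variance_weight(var_a, var_b, cov_ab):
    """Weight of asset A that minimizes portfolio variance
    w^2*var_a + (1-w)^2*var_b + 2*w*(1-w)*cov_ab; setting the
    derivative to zero gives w* = (var_b - cov) / (var_a + var_b - 2*cov),
    clipped to [0, 1] to forbid short positions."""
    w = (var_b - cov_ab) / (var_a + var_b - 2 * cov_ab)
    return min(1.0, max(0.0, w))

# Stocks (variance 0.04) vs bonds (variance 0.01), covariance 0.002
w_stocks = min_variance_weight(0.04, 0.01, 0.002)
print(round(w_stocks, 3))  # fraction of the portfolio to hold in stocks
```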

Morphological Analysis – Use geometric transformations on images and higher-dimensional data to analyze spatial properties, such as counting particles in a microscopic image.

Computer Vision – Process visual data with machine learning and other sophisticated algorithms for analysis of features and patterns, such as identifying road hazards from a video feed.

Signal Processing – Process and filter images, audio and other collected data to analyze underlying patterns, such as detecting an irregular heartbeat from an ECG.

Custom Interface Construction – Make interactive onscreen controls for real-time adjustment of parameters in analyses and visualizations, allowing deeper exploration of data.

Parallel Computing – Distribute parallel tasks to available computation units for large-scale scientific computing and other high-performance applications.

Geocomputation – Use precise geolocation data and powerful geodetic computations to accurately examine real-world situations, such as visualizing optimal routes for a bus service.
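
The basic geodetic building block is great-circle distance between coordinates; a sketch of the haversine formula on a spherical Earth (the bus-stop coordinates are illustrative):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points in kilometres,
    using the haversine formula with a spherical Earth (R = 6371 km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * 6371 * math.asin(math.sqrt(a))

# Distance between two nearby bus stops in central London
print(round(haversine_km(51.5007, -0.1246, 51.5033, -0.1196), 3))
```

Summing such distances along candidate stop sequences is the cost function a route optimizer would minimize.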

Mathematical Modeling – Drive systems of differential equations, recurrence relations and symbolic formulas with your data to test and refine models, such as computing the recovery rate of an epidemic.
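
For the epidemic example, the standard model is the SIR system of differential equations; a minimal sketch integrating it with Euler steps (the rates beta and gamma are invented):

```python
def sir_step(s, i, r, beta, gamma, dt):
    """One Euler step of the SIR epidemic model:
    dS/dt = -beta*S*I,  dI/dt = beta*S*I - gamma*I,  dR/dt = gamma*I,
    where S, I, R are population fractions and gamma is the recovery rate."""
    flow_si = beta * s * i      # new infections per unit time
    flow_ir = gamma * i         # recoveries per unit time
    return s - flow_si * dt, i + (flow_si - flow_ir) * dt, r + flow_ir * dt

s, i, r = 0.99, 0.01, 0.0
for _ in range(1000):           # simulate 100 days with dt = 0.1
    s, i, r = sir_step(s, i, r, beta=0.3, gamma=0.1, dt=0.1)
print(round(r, 3))              # fraction recovered by day 100
```

Fitting beta and gamma to observed case counts is how such a model is refined against real data.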

https://www.wolfram.com/data-science-consulting/index.php.en