Artificial intelligence, or AI, is the simulation of intelligent human behavior: a computer or system designed to perceive its environment, recognize behaviors within it, and take action. Consider self-driving cars: AI-driven systems like these integrate techniques such as machine learning and deep learning into complex environments to enable automation.
AI is estimated to create $13 trillion in economic value worldwide by 2030, according to a McKinsey forecast.
That’s because AI is transforming engineering in nearly every industry and application area. Beyond automated driving, AI is used in models that predict machine failure and indicate when equipment will require maintenance; in health and sensor analytics, such as patient monitoring systems; and in robotic systems that learn and improve directly from experience.
Success with AI requires more than training an AI model, especially in AI-driven systems that make decisions and take action. A solid AI workflow involves preparing the data, creating a model, designing the system on which the model will run, and deploying to hardware or enterprise systems.
Taking raw data and making it useful for an accurate, efficient, and meaningful model is a critical step. In fact, it represents most of your AI effort.
Data preparation requires domain expertise, such as experience with speech and audio signals, navigation and sensor fusion, image and video processing, or radar and lidar. Engineers in these fields are best suited to determine which features of the data are critical, which are unimportant, and which rare events to consider.
AI also involves prodigious amounts of data, yet labeling data and images is tedious and time-consuming. Sometimes you don’t have enough data, especially for safety-critical systems; generating accurate synthetic data can fill out your data sets. In either case, automation is critical to meeting deadlines.
A key factor for success in modeling AI systems is recognizing that AI models exist within a complete system. In automated driving systems, AI for perception must integrate with algorithms for localization and path planning, and with controls for braking, acceleration, and turning.
Consider the AI in predictive maintenance for wind farms and autopilot controls for today’s aircraft.
Complex, AI-driven systems like these require integration and simulation.
AI models need to be deployed to CPUs, GPUs, and/or FPGAs in your final product, whether part of an embedded or edge device, enterprise system, or cloud. AI models running on the embedded or edge device provide the quick results needed in the field, while AI models running in enterprise systems and the cloud provide results from data collected across many devices. Frequently, AI models are deployed to a combination of these systems.
The deployment process is accelerated when you generate code from your models and target your devices. Using code generation optimization techniques and hardware-optimized libraries, you can tune the code to fit the low power profile required by embedded and edge devices or the high-performance needs of enterprise systems and the cloud.
There’s a well-documented shortage of skills in AI. However, engineers and scientists who use MATLAB or Simulink® have the skills and tools necessary to create AI-driven systems in their areas of expertise.
You will spend less time preprocessing data. From time-series sensor data to images to text, MATLAB apps and data types significantly reduce the time required to preprocess data. High-level functions make it easy to synchronize disparate time series, replace outliers with interpolated values, filter noisy signals, split raw text into words, and much more. You can quickly visualize your data to understand trends and identify data quality issues with plots and the Live Editor.
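As an illustrative sketch, that kind of cleanup might look like this in MATLAB (the timetables `sensorA` and `sensorB` and the `Vibration` variable are hypothetical; window sizes are illustrative):

```matlab
% Merge two irregularly sampled sensor timetables onto a common 1 s grid
TT = synchronize(sensorA, sensorB, 'regular', 'linear', 'TimeStep', seconds(1));

% Replace outliers with interpolated values, using a moving median to detect them
TT.Vibration = filloutliers(TT.Vibration, 'linear', 'movmedian', 50);

% Smooth the remaining noise before feature extraction
TT.Vibration = smoothdata(TT.Vibration, 'gaussian', 25);
```

Each of these steps is one function call, which is what keeps preprocessing time low.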
MATLAB apps automate ground-truth labeling of image, video, and audio data.
To test algorithms before data is available from sensors or other equipment, you can generate synthetic data from Simulink. This approach is commonly used in automated driving systems such as adaptive cruise control, lane keeping assist, and automatic emergency braking.
AI modeling techniques vary by application.
MATLAB users have deployed thousands of applications for predictive maintenance, sensor analytics, finance, and communication electronics. Statistics and Machine Learning Toolbox™ makes the hard parts of machine learning easy with apps for training and comparing models, advanced signal processing and feature extraction, classification, regression, and clustering algorithms for supervised and unsupervised learning.
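As a brief sketch of that workflow, training and comparing two classifiers takes only a few lines (the feature table `features` and its response variable `FaultType` are hypothetical):

```matlab
% Train a single decision tree and a bagged ensemble on the same features
mdlTree = fitctree(features, 'FaultType');
mdlEns  = fitcensemble(features, 'FaultType', 'Method', 'Bag');

% Compare generalization error with 5-fold cross-validation
lossTree = kfoldLoss(crossval(mdlTree, 'KFold', 5));
lossEns  = kfoldLoss(crossval(mdlEns,  'KFold', 5));
```

The Classification Learner app performs the same train-and-compare loop interactively.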
ASML, a semiconductor manufacturer, used machine learning techniques to create virtual metrology technology to improve the overlay alignment in the complex structures that make up a chip. “As a process engineer, I had no experience with neural networks or machine learning. I worked through the MATLAB examples to find the best machine learning functions for generating virtual metrology. I couldn’t have done this in C or Python—it would’ve taken too long to find, validate, and integrate the right packages,” explained engineer Emil Schmitt-Weaver.
MATLAB models also execute faster than open-source alternatives on most statistical and machine learning computations.
Engineers use MATLAB deep learning capabilities for automated driving, computer vision, speech and natural language processing, and other applications. Deep Learning Toolbox™ lets you create, interconnect, train, and evaluate the layers of a deep neural network. Examples and pretrained networks make it easy to use MATLAB for deep learning, even without knowledge of advanced computer vision algorithms or neural networks.
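A minimal sketch of defining and training a small convolutional network with Deep Learning Toolbox (the training images `XTrain` and labels `YTrain` are assumed to exist, and the architecture is illustrative):

```matlab
% Define a simple image classification network layer by layer
layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3, 16, 'Padding', 'same')
    batchNormalizationLayer
    reluLayer
    maxPooling2dLayer(2, 'Stride', 2)
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];

% Train with the Adam solver and monitor progress as it runs
opts = trainingOptions('adam', 'MaxEpochs', 5, 'Plots', 'training-progress');
net  = trainNetwork(XTrain, YTrain, layers, opts);
```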
MATLAB enables engineers to work together across different deep learning frameworks. With support for ONNX, MATLAB allows importing and exporting of the latest models to and from other supported frameworks, including TensorFlow.
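In practice, ONNX exchange comes down to a pair of functions (the file names here are hypothetical, and the ONNX support package must be installed):

```matlab
% Bring in a network trained in another framework via ONNX
net = importONNXNetwork('modelFromTensorFlow.onnx', 'OutputLayerType', 'classification');

% Export a MATLAB network for use in another supported framework
exportONNXNetwork(net, 'matlabModel.onnx');
```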
In control systems that benefit from learning based on cumulative reward, reinforcement learning is an ideal technique. Reinforcement Learning Toolbox™ lets you train policies using DQN, A2C, DDPG, and other reinforcement learning algorithms. You can use these policies to implement controllers and decision-making algorithms for complex systems such as robots and autonomous systems. You can implement the policies using deep neural networks, polynomials, or lookup tables.
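As a sketch, training a default DQN agent on one of the toolbox’s predefined environments looks like this (the episode count and reward threshold are illustrative):

```matlab
% Create a predefined cart-pole environment with a discrete action space
env = rlPredefinedEnv('CartPole-Discrete');

% Build a default DQN agent from the environment's observation and action specs
agent = rlDQNAgent(getObservationInfo(env), getActionInfo(env));

% Train until the average-reward stopping criterion is met
opts = rlTrainingOptions('MaxEpisodes', 500, ...
    'StopTrainingCriteria', 'AverageReward', 'StopTrainingValue', 480);
stats = train(agent, env, opts);
```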
Natural language processing models are commonly used for sentiment analysis, predictive maintenance, and topic modeling. Text Analytics Toolbox™ provides algorithms and visualizations for preprocessing, analyzing, and modeling text data. It lets you extract and process raw text from sources such as equipment logs, news feeds, surveys, operator reports, and social media.
Using machine learning techniques such as LSA, LDA, and word embeddings, you can find clusters and create features from high-dimensional text datasets. Features created with Text Analytics Toolbox can be combined with features from other data sources to build machine learning models that take advantage of textual, numeric, and other types of data.
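A short sketch of topic modeling with LDA on raw operator reports (the string array `reports` and the choice of five topics are hypothetical):

```matlab
% Tokenize and normalize the raw text
documents = tokenizedDocument(reports);
documents = removeStopWords(lower(documents));

% Fit an LDA topic model to the bag-of-words counts
bag = bagOfWords(documents);
mdl = fitlda(bag, 5);

% Inspect the most probable words in each discovered topic
topWords = topkwords(mdl, 10);
```

The resulting per-document topic mixtures can serve as features alongside numeric data in a downstream machine learning model.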
Complex, AI-driven systems need to integrate with other algorithms. System design and simulation are important because the overall system impacts the effectiveness of AI models. Engineers use Simulink for rapid design iteration and closed-loop testing.
For example, in an automated driving system, you use AI and simulation to design the controller for braking, acceleration, and turning. You use Simulink to design and simulate the system model, and MATLAB for the AI model. You might use software like the Unreal Engine to synthesize the ideal camera image to feed the AI model.
Voyage, which makes self-driving taxis for retirement communities, deployed a Level 3 autonomous vehicle in less than three months. The integrated model sped the process from idea to road testing. Simulink let them safely test in dangerous conditions.
Simulink also lets you generate failure data from known failure conditions. In a wind farm, you might add the synthetic failure data to the measured data from the wind turbines. You can refine your system model to get an accurate predictor of future equipment failures.
AI models in MATLAB can be deployed on embedded devices or boards, edge devices in the field, enterprise systems, or the cloud.
For deep learning models, you can use GPU Coder™ to generate CUDA® code and deploy it to NVIDIA® GPUs, or generate C code with MATLAB Coder™ for deployment on Intel® and Arm® boards. Vendor-optimized libraries produce deployable models with high-performance inference speed.
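A sketch of that code generation step with GPU Coder (the entry-point function `myPredict` and its input size are hypothetical):

```matlab
% Configure library generation targeting NVIDIA GPUs with the cuDNN library
cfg = coder.gpuConfig('lib');
cfg.TargetLang = 'C++';
cfg.DeepLearningConfig = coder.DeepLearningConfig('cudnn');

% Generate CUDA code for the inference entry point
codegen -config cfg myPredict -args {ones(224,224,3,'single')}
```

Swapping the `DeepLearningConfig` target (for example, to TensorRT) retunes the generated code for a different inference library without changing the MATLAB source.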
With MATLAB Production Server™, you can securely deploy to and integrate with enterprise IT systems, data sources, and operational technologies.