Artificial Intelligence (AI) and Machine Learning (ML) are terms we hear constantly in the tech and industry press, often thrown around without context or background. The PR hype accompanying AI/ML should appropriately trigger skepticism among CFOs about the business value the technology can actually generate. Where is the line between hype and reality? To start asking the right questions, it helps to understand some key concepts.

Why Is Artificial Intelligence Such a Big Deal Now?

The concept of artificial intelligence has been around for a long time. According to Wikipedia, the field of artificial intelligence research was founded as an academic discipline in 1956. The massive increase in the amount of data available since the birth of the internet, along with the growth of sensors, has created the opportunity to apply A.I. to a wider range of problems, such as powering self-driving cars, enabling virtual assistants like Apple's Siri, and helping businesses operate more efficiently. It is the combination of more data, increased processing power, new mathematical models, and the drive for new businesses and efficiencies that makes A.I. so compelling.

How are Artificial Intelligence and Machine Learning Related and Different?

Artificial intelligence is the broader category of using computer science (math and processing power) to receive input and make decisions. Encyclopedia Britannica (yes, it still exists - online) defines artificial intelligence (A.I.) as the ability of a digital computer or computer-controlled robot to perform tasks commonly associated with intelligent beings. Intelligent beings have the ability to learn and adapt.

Machine Learning is a subset of Artificial Intelligence that emphasizes the ability of the platform to learn from inputs (more data) and perform better over time. This might result in the machine learning platform choosing a different algorithm altogether, or choosing different data variables, in order to predict outcomes more accurately.

Deep learning is a specific type of machine learning that uses a multi-layered neural network model (think of neurons in the human brain); the many layers are what give the approach its name. It requires more data, more processing power, and advanced modeling, and is suited to very complex problems.

Artificial Intelligence is the Broader Concept while Machine Learning & Deep Learning are Specific Approaches

Machine Learning is all about Data, Models, Outputs and Improving

Most of the focus of applying advanced technology in F&A organizations is on machine learning, given the types of data sets these teams work with: structured data like general ledgers, structured accounts, and labeled transactions.

The purpose of machine learning is to develop one or more mathematical formulae (algorithms/models) which can leverage historical data to make predictions about future situations. These models should continue to adjust and improve as more historical data accumulates and the machine learns what is successful and what is not.

The output from a machine learning system generally falls into one of two categories: the classification of an item (e.g., "this flower is a rose") or the prediction of the likelihood or probability of an event occurring (e.g., "there is a 57% chance of rain this weekend").
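The distinction between these two output types can be sketched in a few lines of Python. Both functions below are toy models; the rules, features, and weights are invented purely for illustration:

```python
import math

def classify_flower(petal_count, has_thorns):
    """Toy classifier: returns a class label (the first output type)."""
    # Invented rule for illustration only.
    return "rose" if has_thorns and petal_count >= 5 else "daisy"

def rain_probability(humidity, pressure_drop):
    """Toy probability model: returns the likelihood of an event (the second type)."""
    # Logistic function applied to an invented linear score.
    score = 4.0 * humidity + 2.0 * pressure_drop - 3.0
    return 1.0 / (1.0 + math.exp(-score))

label = classify_flower(petal_count=6, has_thorns=True)   # -> "rose"
p = rain_probability(humidity=0.8, pressure_drop=0.1)     # a value between 0 and 1
```

A real system would learn its rules and weights from data rather than have them hand-written, but the shape of the output is the same: a label in one case, a probability in the other.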

Over time, classification or prediction models may self-adjust to new and changing patterns as external data systems evolve. This can result in flexible, low-maintenance, and self-evolving systems which require little care and feeding, and which may improve over time as they “discover” additional signals within data. A machine learning system may have the ability to effectively adjust and evolve to changing conditions, much like you or I, and in this way can help augment many manual, human tasks.

Machine Learning Flow Overview

Two Key Machine Learning Concepts CFOs Should Know: Supervised and Unsupervised Learning

To give this discussion context, imagine, as a fun example, that your daughter, a high school junior in the Computer Science club, wants to use machine learning to predict the color of the sky for any minute of any day of the year. One of the first decisions she will have to make is whether to use a supervised or an unsupervised approach.

Supervised Learning

Supervised learning is a type of machine learning which relies on humans to provide an initial set of correct answers. Machine learning models using supervised learning can use this initial set to form a basis for future predictions. While the algorithm can be specific to a particular use case, often systems use a more general model that can handle a variety of situations, also learning and adjusting over time.

This initial set of data that a model learns from is often referred to as a “training set.” A good training dataset is a representative sample of historical examples and includes both the original raw input values and the corresponding output value (the correct answer) for each set of examples.

When a dataset has inputs of known type and meaning, the data is considered structured. Structured data often simplifies data retrieval, which may help a scientist more quickly anticipate possible machine learning techniques and approaches. Unstructured data, on the other hand, often must be interpreted and cleaned by other methods before the appropriate learning approach can be identified.

In our sky-color example, our student might decide that a supervised approach seems appropriate and collect structured data over the month of March that includes the color of the sky for each day and minute (perhaps over a random sample throughout the month). She further decides to limit the possible predictions to four colors: blue, orange, purple, and black. This set of observations would be her training set.

Once the system "learns" from the training set, it can make predictions about new, unseen data, based on the historical examples it has been provided. It may continue to improve over time as each new prediction is monitored by something or someone (a human) and corrections are signaled to the system. For example, a history of incorrect predictions may be incorporated into an additional "lesson" for the model's algorithms. Imagine, for instance, what kind of predictions we might expect if our sky-color observations had only been taken at midnight (or noon) of every day; a new, updated "lesson" would be in order.
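A supervised version of the sky-color predictor can be sketched in a few lines of Python. The training examples and the 1-nearest-neighbor approach below are illustrative choices (any labeled pairs and many other algorithms would do); the point is that the model's answers come from labeled examples a human supplied:

```python
# Training set: (minute-of-day, observed color) pairs, invented for illustration.
training_set = [
    (0, "black"),      # midnight
    (360, "orange"),   # around 6:00 am, sunrise
    (720, "blue"),     # noon
    (1140, "purple"),  # around 7:00 pm, dusk
    (1320, "black"),   # 10:00 pm
]

def predict_sky_color(minute):
    """Predict by finding the labeled example closest in time (1-nearest-neighbor)."""
    nearest = min(training_set, key=lambda example: abs(example[0] - minute))
    return nearest[1]

print(predict_sky_color(700))   # near noon -> "blue"
print(predict_sky_color(100))   # early morning -> "black"
```

Adding more labeled observations to `training_set`, or correcting mislabeled ones, is exactly the "new lesson" described above: the model's future answers improve without changing the prediction code itself.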

Unsupervised Learning

Unsupervised learning differs from supervised learning in two key areas.

First, unsupervised learning starts without preconceived models or algorithms. Instead, the machine learning system constructs its own models based upon patterns and relationships it finds within the data. It assumes some relationship is encoded in the data and sets about finding it and organizing the data accordingly. Determining sentiment in social media conversations (text is highly unstructured data) or identifying groups of customers from e-commerce transaction data are common uses of unsupervised learning.

Second, by definition, unsupervised learning does not require or rely upon tagged data, training data sets, or subsequent intervention to decide how to treat new data. Unsupervised learning can certainly use structured data sets as part of its overall input, but it does not require them. What technically makes it unsupervised is that you are not providing the correct answers - i.e., the training set - as we did in supervised learning.

Using your daughter's sky-color predictor as an example: suppose that instead of recording March data in a table, she took pictures of the sky every minute and fed all of the photos and timestamps into her system. The system itself would seek to group similar colors and minutes and try to establish relationships among them - this is called "clustering."
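The clustering idea can be sketched with a simple k-means routine in Python. Here each photo is reduced to a single invented "brightness" number (0 = black night sky, 1 = bright daytime sky); a real system would work with full pixel data, but the grouping logic is the same in spirit. Note that no correct answers are supplied anywhere - the groups emerge from the data itself:

```python
# One invented brightness value per "photo" - note there are no labels.
brightness = [0.02, 0.05, 0.04, 0.48, 0.52, 0.55, 0.95, 0.92, 0.97]

def kmeans_1d(values, k, iterations=10):
    """Group 1-D values into k clusters (assumes k >= 2 and k <= len(values))."""
    # Deterministic start: spread the initial centers across the sorted values.
    ordered = sorted(values)
    centers = [ordered[i * (len(ordered) - 1) // (k - 1)] for i in range(k)]
    for _ in range(iterations):
        # Assignment step: each value joins the cluster with the nearest center.
        clusters = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda i: abs(centers[i] - v))
            clusters[nearest].append(v)
        # Update step: each center moves to the mean of its cluster.
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

centers, clusters = kmeans_1d(brightness, k=3)
# The three centers settle near the night, dusk/dawn, and daytime groups.
```

The system "discovers" that the photos fall into three natural groups without ever being told what black, purple, or blue sky looks like - that discovery is the clustering described above.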

Three Key Areas AI and Machine Learning Can Help a CFO's Organization

A recent survey of financial professionals by the AICPA and Oracle described opportunities for F&A teams to harness technology to dramatically improve business operations and generate positive revenue growth in three ways:

1) Improve Business Processes through Modernization. Intelligent process automation using a combination of Robotic Process Automation, Machine Learning, and integrated workflow can reduce manual, repetitive tasks and free your workforce to execute their highest-value work, resulting in more productive employees and lower costs.

2) Generate Deeper Data Insights. Modern technology is enabling the connection of systems into a more centralized view of finances and operations. Artificial Intelligence can help identify patterns for improvement and drive faster, better-informed decisions.

3) Increase Business Influence. In the age of data-driven decision making, finance and accounting teams have an enormous opportunity to move beyond reporting and further into strategy, based on the business data they work with every day. Artificial Intelligence and Machine Learning enable this shift by reducing time spent on repetitive tasks, delivering financial reports more quickly and accurately, and identifying trends that can inform strategic decisions.

The Time and the Technology Are Now

A.I. and M.L. have had significant impacts in many industries and are becoming more mainstream in their applications. It is time for finance and accounting teams to invest serious consideration in this incredibly powerful technology to move the F&A organization to the next level of efficiency and business performance.

We hope this white paper has helped you understand more about the technology and how it can impact your organization and business. Contact us at hello@sigmaiq.com if you would like to learn more about our AI-powered, enterprise-strength matching reconciliation engine.


References

"Artificial Intelligence", Encyclopedia Britannica.

"Cousins of Artificial Intelligence", Seema Singh, Towards Data Science.

"Machine Learning: An Introduction", Sujeet Kumar Jaiswal.

"Agile Finance Unleashed: The Key Traits of Digital Finance Leaders", Association of International Certified Professional Accountants and Oracle.