FICO Makes Artificial Intelligence Explainable with Latest Release of its Analytics Workbench

FICO aims to make artificial intelligence more explainable with the latest release of its Analytics Workbench.

On September 12, FICO announced the release of the latest version of the FICO Analytics Workbench, a cloud-based advanced analytics development environment that provides business users and data scientists with sophisticated yet easy-to-use tools for data exploration, visual data wrangling, decision strategy design, and machine learning. According to the announcement, the Analytics Workbench is explicitly designed for users with a variety of skill sets, including credit risk officers, data scientists, and business analysts.

One of the main objectives of the toolkit is to help data scientists better understand the machine learning models behind AI-derived decisions. When asked about the new Analytics Workbench, Jari Koister, vice president of product management at FICO, stated, "As businesses depend on machine learning models more and more, explanation is critical, particularly in the way that AI-derived decisions impact consumers. Leveraging our more than 60 years of experience in analytics and more than 100 patents filed in machine learning, we are excited at opening up the machine learning black box and making AI explainable. With Analytics Workbench, our customers can gain the insights and transparency needed to support their AI-based decisions."

As we have seen, artificial intelligence and machine learning already play a significant role within the payments industry, and their involvement and integration will only increase as companies continue to see massive benefits from implementing the technology across various areas of their business. According to Sameer Singh, assistant professor of computer science at the University of California, Irvine, "Computers are increasingly a more important part of our lives, and automation is just going to improve over time, so it's increasingly important to know why these complicated AI and ML systems are making the decisions that they are. The more accurate the algorithm, the harder it is to interpret, especially with deep learning. Explanations are important, they can help non-experts to understand the reasons behind the AI decisions, and help avoid common pitfalls of machine learning."
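FICO has not published the internals of the workbench's explanation tooling, but the general idea is easiest to see in the simplest case. The sketch below, using synthetic data and hypothetical feature names, shows how a linear credit-scoring model's decision can be decomposed into per-feature contributions; as Singh notes, more accurate models such as deep networks require more elaborate explanation techniques.

```python
# A minimal, hypothetical sketch (not FICO's actual method) of explaining
# a single model decision: for a linear model such as logistic regression,
# each feature's contribution to the predicted log-odds is simply its
# coefficient multiplied by the feature's value.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in data; the feature names are illustrative only.
X, y = make_classification(n_samples=500, n_features=4, random_state=0)
feature_names = ["utilization", "payment_history", "account_age", "inquiries"]

model = LogisticRegression().fit(X, y)

# Explain one applicant's score as per-feature contributions to the log-odds,
# ranked by magnitude so the biggest drivers of the decision come first.
applicant = X[0]
contributions = model.coef_[0] * applicant
for name, value in sorted(zip(feature_names, contributions),
                          key=lambda pair: -abs(pair[1])):
    print(f"{name:>16}: {value:+.3f}")
print(f"{'intercept':>16}: {model.intercept_[0]:+.3f}")
```

Ranking contributions this way resembles the "reason codes" long used in credit scoring, and it is the kind of per-decision transparency FICO says the workbench is meant to extend to more complex models.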

Education has always been a barrier to implementing new technologies within an industry. Fortunately, it appears that FICO is taking steps with this release to make it easier for individuals to understand and implement its models.
