Vincent Gosselin, Founder and CEO at Taipy — Pioneering the Integration of AI and Python: Transforming Industry Challenges into Opportunities

In this interview with Vincent Gosselin, Founder and CEO at Taipy, we delve into the fascinating journey of a visionary who has witnessed firsthand the ebbs and flows of artificial intelligence (AI) and machine learning (ML) over more than three decades. Vincent's pioneering work at Taipy, a groundbreaking framework designed to streamline the complex process of turning ML model prototypes into fully operational web applications, marks a significant milestone in the field. His experience spans the early optimism of AI innovation, the challenges of the AI winters, and the resurgence of AI as a cornerstone of technological advancement across industries. Through Vincent's eyes, we explore the pivotal moments that have shaped the AI landscape, the evolution of Taipy in addressing the challenges businesses face when integrating AI solutions, and the future directions of AI and ML technologies.

Vincent, with over three decades of experience in AI and ML, what pivotal moments have you witnessed in the evolution of these technologies, and how have they reshaped industries?

In the late 1980s, the IT world was bubbling with so-called “AI innovations”: the craze was mostly about Prolog, Expert Systems, and the nascent development of Neural Networks.

Then winter came after 1990… Most companies (the word “start-up” didn’t exist in those days!) selling expert systems or neural networks died, and all projects based on these stopped.

I found myself entangled in a multi-million dollar project aiming to supplant human expertise, a doomed endeavor from the outset due to the rapid pace of business rule changes outpacing the capacity of AI engineers to integrate them into the Expert System. I am not even mentioning the great difficulty of fully testing these complex rule-based systems.

Even the word “AI” was, at the very least, considered “has-been.”

In the industry, only what was truly "working" was used. Very few companies were making a living out of intelligent algorithms; ILOG was one of them. I worked for them for many years and had a lot of fun implementing models based on Mathematical Programming or other tree-search techniques inherited from AI. Surprisingly enough, these little-known fields have been at the heart of the supply chain revolution that has impacted all major companies to this day.

In 2010, I started to switch progressively to Python programming. The productivity boost you get from Python (whether you're a seasoned programmer or a scientist) is nothing short of amazing. Python swiftly emerged as the language of choice for AI development and numerous other scientific fields, and it remains so today. However, this surge in productivity failed to permeate other IT sectors (graphical development, back-end, etc.), leaving many companies struggling to execute successful AI projects for their business needs. This challenge gave rise to Taipy!

As CEO and co-founder of Taipy, could you elaborate on the initial challenges you faced when transitioning ML model prototypes into fully operational web applications, and how you overcame them?

One primary challenge in developing Data/AI projects is navigating the various silos involved: Data Engineers, Data Scientists, IT Front-End and Back-End developers, MLOps, DevOps, and End-Users, each with their own technology stacks. These silos often isolate end-users from the development cycle, leading to prolonged project timelines, inflated costs, heightened expectations, and a lack of established best practices.

Another obstacle stems from the absence of established best practices, potentially exacerbated by inexperienced project managers. For instance, Data Scientists frequently view their role as complete once testing is finished and their algorithm is handed off to the next silo, typically the IT department. This approach yields two critical outcomes: firstly, end-users are often provided with software offering a single solution, lacking interactivity with the AI engine, resulting in poor acceptance. Secondly, even if end-users are initially satisfied, software performance and acceptance often deteriorate over time.

Considering your significant roles at companies like ILOG and IBM, how did these experiences influence the development and vision of Taipy?

I worked for almost 30 years as a Data Scientist, Mentor, Project Manager, and Consulting Director. We worked on strategic projects for top companies all over the world, mostly in Japan, Korea, and the US. These projects always involved AI, end-users, and high ROI.

These experiences heavily influenced the way we designed Taipy.

  • First, Taipy has been designed to build great Decision Support Systems, aka "DSS". The term is not used much anymore. A DSS gives end-users the capability to interact with the software or the AI engine through an intuitive graphical interface and a proper backend. For this purpose, Taipy supports "What-if Analysis" and "Scenario Management" (see the sketch after this list).
  • Many of the projects we did involved large teams, long cycles, and significant costs. Taipy is designed to combine ease of use with a great capacity for customization to meet actual project requirements. Too often, "easy-to-use" software is great for a pilot but fails to produce production-ready applications.
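As an illustration, here is a minimal sketch of what scenario management can look like in Taipy. The data-node names, the placeholder prediction function, and the overall flow are illustrative assumptions, and the exact configuration API may vary between Taipy versions.

```python
import taipy as tp
from taipy import Config

# Illustrative data nodes: historical sales as input, a forecast as output
# (the names "historical_sales" and "forecast" are hypothetical)
historical_cfg = Config.configure_data_node("historical_sales")
forecast_cfg = Config.configure_data_node("forecast")

# Placeholder model: a real project would plug in an actual ML algorithm here
def predict(historical_sales):
    return sum(historical_sales) / len(historical_sales)

task_cfg = Config.configure_task("predict", predict, historical_cfg, forecast_cfg)
scenario_cfg = Config.configure_scenario("sales_scenario", task_configs=[task_cfg])

if __name__ == "__main__":
    tp.Core().run()
    # Each scenario is a self-contained "what-if" run that can be stored,
    # re-submitted with different inputs, and compared with past scenarios
    scenario = tp.create_scenario(scenario_cfg)
    scenario.historical_sales.write([100, 120, 130])
    tp.submit(scenario)
    print(scenario.forecast.read())
```

End-users can then create and compare several such scenarios from the GUI, which is what makes what-if analysis practical for non-developers.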

You’ve emphasized the importance of Python in data science and ML. In your view, what makes Python stand out as the language of choice for developers in these fields?

Yes, Python has reached that status now. Python is just so flexible (many traditional developers think this is a problem, but I don't). For those who have never developed in Python: do not think of Python as just another programming language.

  • First of all, Python is an ecosystem in which the developer has access to thousands of great specialized libraries. All the best AI libraries, as well as many libraries from other scientific fields, are there, most of them open-source.
  • Students from all fields, not only Computer Science, are now trained in Python. This creates an incredible lingua franca, never achieved before, that incentivizes cross-collaboration between all scientific fields.
  • Last but not least, do not give too much credit to people ranting about Python not being as performant as C++, C, Scala, Java, etc. Most of the time, Python's functions wrap other code written in efficient languages (C++, C, Scala, etc.), as the sketch after this list illustrates.
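A familiar example of this, outside of Taipy itself: NumPy is largely implemented in C, so a one-line vectorized call runs at compiled-code speed while the equivalent pure-Python loop does not.

```python
import numpy as np

# One million random values
values = np.random.rand(1_000_000)

# Pure-Python loop: interpreted, element by element
total_slow = 0.0
for v in values:
    total_slow += v

# NumPy sum: a single call that dispatches to optimized C code
total_fast = values.sum()

print(total_slow, total_fast)  # same result, very different runtimes
```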

Taipy is described as an innovative solution that bridges a crucial gap in the market. Can you discuss a specific project or case where Taipy dramatically simplified the AI development process for a business?

With Taipy, all participants in the project use the same Python framework. Taipy connects Data Engineers, Data Scientists, software developers, DevOps people, etc.

Taipy helps to efficiently bring an AI algorithm into a full-blown project and into the hands of end-users. Of course, Taipy can also be used to develop pilots, but we have made sure Taipy can scale into real projects. Before Taipy, we couldn’t find such a framework in the Python galaxy.

Consider an AI project undertaken by a prominent European retailer, aimed at predicting cash flow for the entire company over the next three months. Outsourcing the development of such a project would have taken 6-8 months, involved four consultants, and cost in excess of 800K. However, the retailer's IT division/Datalab opted to leverage Taipy for internal development instead. Remarkably, the project was completed by just 1.5 people in under three months, leaving end-users highly satisfied. Taipy has since become the standard development platform, catalyzing the initiation of several new AI projects.

Beyond the significant productivity increase Taipy brings, its true value lies in fostering success.

Too often, we see companies involved in lengthy and costly projects that are also poorly accepted by end-users, eventually leading to the software not being used at all…

Looking at the current landscape of AI and automation, what emerging trends do you believe will have the most profound impact on businesses in the next decade?

For businesses, especially for B2B applications, current AI technology is not able to replace expert business users. The challenge lies in delivering AI technologies as usable and useful tools that enhance business users' decision-making, hence our focus on Decision Support Systems (see above). Several fields are showing potential, such as Hybrid models (i.e., combining Symbolic models with existing Deep Learning models), Reinforcement Learning, and Knowledge Graphs.

The ability to create complex charts, interactive dashboards, and chatbots directly in Python is a game-changer. How does Taipy manage to streamline these processes while ensuring scalability and performance?

Yes. In our mission to empower developers to deliver exceptional Decision Support Systems to business users, one cannot overstate the importance of graphics. Taipy provides a full GUI library that combines:

  • ease of use (anyone can develop a great graphical user interface)
  • performance
  • support for large data visuals
  • support for long-running jobs, such as training an AI engine
  • native multi-user support
  • etc.

Taipy is designed for a broad range of graphical user interfaces, from dashboards and chatbots to specialized interfaces for Decision Support Systems (DSS). In DSS, graphical support for what-if analysis is crucial, allowing users to create new scenarios, explore past scenarios, compare outcomes, and monitor KPIs over time.

Performance is central to Taipy. For example, you can display 1 million data points on a line chart without compromising response time, all achievable with just a few lines of Python code.
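Here is a minimal sketch of what that can look like with Taipy's GUI library; the synthetic dataset and the page layout are illustrative assumptions rather than a benchmark.

```python
import numpy as np
import pandas as pd
from taipy.gui import Gui

# One million synthetic points: a noisy sine wave (illustrative data)
x = np.linspace(0, 100, 1_000_000)
data = pd.DataFrame({"x": x, "y": np.sin(x) + np.random.normal(0, 0.1, x.size)})

# Taipy's Markdown-like page syntax: a single chart control bound to the DataFrame
page = """
# Line chart over 1,000,000 points
<|{data}|chart|x=x|y=y|>
"""

Gui(page).run()
```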

Furthermore, in April, we are launching Taipy Designer, a new product offering a no-code approach. Users can build complete interfaces effortlessly using drag-and-drop graphical widgets on a canvas.

AI’s impact on the workforce is a topic of considerable debate. From your perspective, how can businesses best prepare their workforce for the integration of AI and automation technologies?

Automation has a dual impact, affecting both software producers, such as Data Scientists and Developers, and users, whether they’re business users in B2B software or customers in B2C software.

For those involved in software development, the impact of automation is substantial. Tools like Copilot and ChatGPT significantly accelerate the development cycle, benefiting frameworks like Taipy. However, it’s important to recognize that these tools are just that—tools. They cannot replace the expertise of seasoned developers. Effective utilization requires a clear understanding of objectives, the ability to ask the right questions, and judicious decision-making when utilizing tools like ChatGPT.

The business users, who are the end-users of a Taipy application, are usually experts in their field. It is wishful thinking to believe that these people can be replaced. This is exactly why we believe the best strategy is to provide Decision Support Systems where Taipy helps them "play" with an AI engine through various options and parameters, allowing them to modify its inputs and outputs.

The optimal strategy for companies is to foster an environment where IT and AI/IT technical teams collaborate closely with business users. This entails ensuring that:

  • Business Users understand that AI will enhance their decision-making capabilities without replacing them!
  • The development/IT teams understand how to engage Business Users and capture the essence of building Decision-Support Systems.

You’ve been part of major AI projects across various sectors. Can you share an instance where an AI solution you led or contributed to resulted in transformative outcomes for a company?

One such application or suite of Applications was for McDonald’s. The initial project was about generating the sales forecast for every single store in a given country with a 15-minute precision over the next seven days. The sales forecast is crucial for a McDonald’s store manager since it drives all operations, including the staffing for each store position. The software was designed to produce high-quality forecasts, yet it provided the capability for a manager to influence the forecast using last-minute information (e.g. the competitor next door running a promotion).

More recently, we have seen amazing applications involving large corporations deploying Energy Management Software covering:

  • Long-term planning/purchase of electricity blocks
  • Short-term (day-ahead) prediction of electricity prices, linked with purchase decisions
  • What-if analysis to estimate the impact on the energy purchase strategy of specific scenarios, such as:
    • Increasing the number of consuming locations (factories)
    • Transitioning locations to solar energy
    • etc.

Beyond Taipy, how do you envision the future of AI and ML development tools, and what challenges do you see arising in making these technologies more accessible and efficient?

AI and ML tools are continuously improving to make AI engines more powerful and easier to create, test, and deploy. New fields such as "AutoML" or "AutoAI" propose to automate, up to a certain point, the creation of AI models. Some software companies have even started to provide "AI as a service".

For instance, Scikit-learn, a renowned open-source library and one of our partners, consistently introduces new tools and methods to streamline the work of Data Scientists. It's worth noting that a significant portion of a data scientist's time, up to 90%, is devoted not to creating the AI engine itself but to sourcing, comprehending, and preparing data to maximize its effectiveness for AI engines.

Despite these advancements, what’s currently lacking is a practical methodology founded on a robust set of best practices to steer AI projects. Mistakes detrimental to the success of AI projects remain prevalent among AI and IT departments, regardless of size. Over-engineering is just one example of such pitfalls.

Your educational background laid a significant theoretical foundation for your career. How important do you believe formal education in computer science and AI is for aspiring professionals in this field today?

Definitely. Understanding the concepts and inner workings behind these different AI technologies is more necessary than ever if you want to understand their strengths and weaknesses and how to avoid major pitfalls on a real project. A solid theoretical foundation is particularly crucial for projects tailored to specific applications.

At the same time, we will also see the emergence of simpler, out-of-the-box, self-service models that do not require extensive background knowledge to be used by Analysts on a large scale for more standard generic types of applications.

Lastly, Vincent, balancing the roles of a CEO, innovator, and technology leader, how do you keep yourself inspired and motivated to continue pushing the boundaries of what’s possible in AI and ML?

Much of our inspiration comes from two primary sources: actively listening to the challenges our customers encounter and staying abreast of breakthroughs and innovations that have the potential to significantly impact both us and our customers.

Maintaining an open mind is crucial, as it allows us to experiment with new techniques, comprehend their strengths, and acknowledge their limitations.

Additionally, our thriving community surrounding Taipy contributes to a continuous influx of ideas and suggestions, enriching our innovation and development process.
