Large models have emerged as the most recent ground-breaking achievement in artificial intelligence, and particularly in machine learning. In 2024 these forms of AI are set to disrupt businesses significantly.
In a three-way take on evolving developments in the world of artificial intelligence, three experts at Ikigai Labs have shared with Digital Journal readers how they expect the technology to progress into 2024. Each prediction is based on technologies that use massive amounts of data to learn billions of parameters and put these to meaningful business use.
Prediction #1: AI upskilling will become a pivotal requirement to transform into a future-ready, diverse, and equitable workforce
To meet the requirements of AI, workforces will need to adapt and acquire new skills, says Kamal Ahluwalia, President of Ikigai Labs. These skills are generally more complex than those possessed by the current workforce.
This leads Ahluwalia to predict: “With the rapid advancements in AI and the evolving skills required for success in the AI-driven economy, companies will increase their investments in AI training. Training and education programs will emerge to help workers from all backgrounds develop the skills they need to thrive in the new workplace. As AI continues to revolutionize the workplace, its implementation poses a risk to exacerbating existing inequalities and further marginalizing underrepresented groups. AI companies will recognize this risk and take proactive steps to address this issue, such as developing AI systems that are fair and unbiased, as well as ensuring that AI-powered jobs are accessible to all.”
Prediction #2: Next year will see a push for more affordable AI solutions
As AI increases in coverage, the costs for businesses wanting to adopt the technology will start to fall. Ahluwalia takes the view: “Large language models (LLMs) are trained on Internet-scale data, making them very compute-intensive and costly to implement. As a result, we’re seeing most of the investment coming from big companies with deep pockets. This is hindering the ingenuity we’ve come to expect from startups and small companies which ultimately hurts both buyers and providers.”
Smaller players are likely to make the market more dynamic and innovative. Ahluwalia finds: “If AI-powered solutions can only be delivered by the largest companies, then AI will not move at the pace we need it to. This will spur demand in the coming year for more computationally efficient and affordable AI solutions, enabling a more diverse and nimble set of solution providers to deliver AI-powered solutions across a wide range of use cases.”
Prediction #3: AI will revolutionize ESG practices in 2024
On the topic of environmental, social, and governance (ESG) issues, Gopkiran Rao, VP of Product and Solutions Marketing, sees advances in technology as providing a means to push this agenda: “With the rise of ESG practices taking centre stage, organizations will look for more computationally efficient models, such as Large Graphical Models (LGMs), to reduce waste and drive sustainability efforts. By providing businesses with data-driven insights and innovative solutions to tackle environmental challenges, LGMs will revolutionize ESG practices. Organizations that leverage LGMs for AI-powered demand forecasting will pinpoint areas of waste generation and develop targeted strategies for reduction and maximizing resource utilization.”
Prediction #4: LGMs become the next household gen AI tech in the enterprise
Advances in graphical models for AI are expected by Devavrat Shah, Co-CEO and Founder. After enterprises have toyed with large language models, he argues, advances in graphical models will inevitably follow.
According to Shah: “Today, nearly every organization is experimenting with LLMs in some way. Next year, another major AI technology will emerge alongside LLMs: Large Graphical Models (LGMs). An LGM is a probabilistic model that uses a graph to represent the conditional dependence structure between a set of random variables. LGMs are probabilistic in nature, aiming to capture the entire joint distribution between all variables of interest. They are particularly suitable for modelling tabular data, such as data found in spreadsheets or tables.”
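Shah's definition can be illustrated with a toy example. The sketch below is not Ikigai's actual LGM technology; it is a minimal, hypothetical probabilistic graphical model with a single directed edge (season → demand) showing how a graph encodes conditional dependence and lets the joint distribution factorize:

```python
# Toy probabilistic graphical model (illustrative only -- NOT Ikigai's LGM).
# One directed edge, season -> demand, encodes the conditional dependence
# structure, so the joint factorizes as:
#   P(season, demand) = P(season) * P(demand | season)

# Prior over the parent variable.
p_season = {"high": 0.25, "low": 0.75}

# Conditional distribution of the child given the parent.
p_demand_given_season = {
    "high": {"strong": 0.8, "weak": 0.2},
    "low":  {"strong": 0.3, "weak": 0.7},
}

def joint(season, demand):
    """P(season, demand), using the factorization implied by the graph."""
    return p_season[season] * p_demand_given_season[season][demand]

def marginal_demand(demand):
    """Forecast P(demand) by marginalizing (summing) over all seasons."""
    return sum(joint(s, demand) for s in p_season)

# 0.25 * 0.8 + 0.75 * 0.3 = 0.425
print(round(marginal_demand("strong"), 3))
```

A real LGM would learn such dependence structures and distributions over many variables from tabular data rather than hard-coding them, but the factorization principle is the same.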
The reason, he explains, is that: “LGMs are useful for analysing time series data. By analysing time series data through the novel lens of tabular data, LGMs are able to forecast critical business trends, such as sales, inventory levels and supply chain performance. These insights help guide enterprises to make better decisions.”
As to the implications, Shah suggests: “This is game changing because existing AI models have not adequately addressed the challenge of analysing tabular, time-series data (which accounts for the majority of enterprise data). Instead, LLMs and other models were created to analyse text documents. That’s limited the enterprise use cases they’re really capable of supporting: LLMs are great for building chatbots, but they’re not designed to support detailed predictions and forecasting. Those sorts of use cases offer organizations the most business value today – and LGMs are the only technology that enables them.”
In conclusion, Shah makes the following assessment: “Enterprises already have tons of time-series data, so it’ll be easy for them to begin getting value from LGMs. As a result, in 2024, LGM adoption will take off, particularly in retail and healthcare.”