This is the year when companies, after discovery phases and team experiments, are looking to activate and take advantage of recent AI advances.
And so questions emerge, such as “What should we democratize when leveraging AI?” There are common scenarios, as well as specific ones, that depend on the company and the industry it belongs to.
A common scenario across many industries when democratizing data is data visualization and reporting. In digital marketing, for example, data scientists and data analysts can automate reports and make them available to the client. This is a great enabler, as business stakeholders gain easy access and self-serve capabilities. Setting the business on this path of insights and data literacy opens the door to new opportunities and use cases. One of those is predictive analytics, where data leads can leverage past consumer behaviour at the top of the funnel (such as bounce rates) or mid to lower funnel (for example, by enriching CRM data with web analytics) to predict actions towards goals such as generating new leads or reducing churn. The learnings can then feed into content strategy, e-commerce, and other areas.
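As a minimal, hypothetical sketch of that predictive-analytics use case, the snippet below trains a simple churn classifier on engagement features such as bounce rate. The file name, column names, and model choice are illustrative assumptions, not a prescribed setup.

```python
# A minimal, hypothetical churn-prediction sketch. The file name and feature
# columns (bounce_rate, sessions, pages_per_session, days_since_last_visit)
# are illustrative assumptions, not a prescribed schema.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import classification_report

df = pd.read_csv("crm_web_analytics.csv")  # hypothetical CRM + web analytics export
features = ["bounce_rate", "sessions", "pages_per_session", "days_since_last_visit"]

X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["churned"], test_size=0.2, random_state=42
)

model = GradientBoostingClassifier(random_state=42)
model.fit(X_train, y_train)

# Evaluate before acting on the predictions (e.g. retention campaigns).
print(classification_report(y_test, model.predict(X_test)))
```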
A more specific scenario is healthcare. AI can be leveraged first to process diverse, usually unstructured, information from medical histories, lab results, and data sources covering a variety of medical conditions; and second, to extract insights from it, with early diagnosis as one of many goals in mind. Democratizing this technology can also reduce administrative complexity by digitizing electronic health records, freeing up time for those working in healthcare.
Another candidate for democratization is storage and computing. You can leverage cloud providers when building and deploying models. With providers such as Microsoft Azure and Amazon AWS, you can access high-end GPU computing power with immediate scalability and substantial cost savings, within a few clicks, instead of setting up your own on-prem data centre.
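As one hedged illustration of those “few clicks”, here is a minimal sketch of provisioning a GPU cluster on Azure Machine Learning with the Python SDK v2. The subscription, resource group, workspace, and VM size are placeholders to replace with your own values.

```python
# A minimal sketch of provisioning GPU compute on Azure Machine Learning (SDK v2).
# Subscription, resource group, workspace, and VM size are placeholders.
from azure.identity import DefaultAzureCredential
from azure.ai.ml import MLClient
from azure.ai.ml.entities import AmlCompute

ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",      # placeholder
    resource_group_name="<resource-group>",   # placeholder
    workspace_name="<workspace-name>",        # placeholder
)

# GPU cluster that scales down to zero when idle, so you only pay while training.
gpu_cluster = AmlCompute(
    name="gpu-cluster",
    size="Standard_NC6s_v3",  # example GPU VM size; pick one available in your region
    min_instances=0,
    max_instances=2,
)
ml_client.compute.begin_create_or_update(gpu_cluster).result()
```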
This democratization can also bring new challenges. As capabilities become available, we are asked to keep building new features and AI-powered automation solutions. As a result, there is a risk that AI implementations surface without the appropriate guardrails. Here are some of those risks:
- Teams without AI literacy.
- Bias. Using the wrong training data, incorrect features, or models ill-suited to the task, in addition to biases from the designers (data scientists and engineers), all contribute to bias in your AI models.
- Hallucinations. When LLMs produce nonsensical or inaccurate outputs.
- Deepfakes. Leveraging AI for media manipulation, where images, voices, videos, and text are altered or fully generated. The misuse of this technique is a risk for governments, companies, and individuals, as it can be used to cause harm: stealing sensitive and personal information, manipulating and distorting context to create fake information, bullying, fraud, and more.
That said, we have a good chance to overcome these risks by applying frameworks and good practices as part of our operations. Here are elements to consider within such a framework:
- Governance.
- Determine what you are democratizing. This applies internally, between the company’s departments, as well as with external partners. As you define your ecosystem and pipelines, you can establish which areas to distribute between teams based on their expertise, knowledge, and guardrails: data ingestion, models and features, and so on.
- Ownership and control of the data used to feed the AI and ML models.
- Intellectual property (IP) is relevant. This defines not only what technology you will use, but also whether the company is comfortable using, for example, cloud platforms for image or audio processing without worrying about confidentiality being breached and their “secret sauce” potentially leaving the walls of the organization.
- Frameworks that dictate the types of audiences (users) that will access the systems based on the use cases. For example:
- Those accessing data visualization and reports,
- Those working on model development and fine-tuning
- MLOps. Having the right processes and tools (e.g. cloud services and CI/CD pipelines) to consistently and responsibly deliver AI solutions; a brief sketch follows this list. MLOps will bring to your operations:
- Deployment pipeline
- Monitoring
- Automation for data preprocessing and elements related to the model training process
- QA automation, facilitating the areas of testing and validation
- Potential cost savings related to data storage and efficiencies on cloud services
- Access to the services provided for the many roles (marketers, developers, QA engineers, data engineers, data scientists, and others) involved in the organization’s end-to-end AI activation
- Training and knowledge sharing. Training should happen at every level, promoting AI solution thinking, agility, and experimentation within the organization’s culture. This will accelerate the achievement of your OKRs.
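To make the experiment-tracking piece of MLOps concrete, here is a minimal sketch using MLflow and scikit-learn on synthetic data. The experiment name and parameters are illustrative assumptions; a full pipeline would add data validation, CI/CD triggers, deployment, and monitoring.

```python
# A minimal sketch of experiment tracking with MLflow on synthetic data.
# Experiment name, parameters, and data are illustrative placeholders.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1_000, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

mlflow.set_experiment("demo-experiment")  # hypothetical experiment name

with mlflow.start_run():
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))

    # Log parameters, metrics, and the model artifact so every run is
    # reproducible and comparable from the tracking UI.
    mlflow.log_param("max_iter", 1000)
    mlflow.log_metric("accuracy", acc)
    mlflow.sklearn.log_model(model, artifact_path="model")
    # With a model registry backend configured, the model could also be
    # registered here for controlled promotion to staging/production.
```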
As technology progresses, providers such as Amazon AWS, Microsoft Azure, and many others continue to make AI technologies broadly available, allowing their customers to use cloud services to experiment and determine the value they can unlock. After all, 2024 is the year for companies to deliver on the value and the ROI, and to push for the adoption of AI tools.
Microsoft AzureML
Azure Machine Learning is a cloud service for accelerating and managing the machine learning (ML) project lifecycle. ML professionals, data scientists, and engineers can use it in their day-to-day workflows to train and deploy models and manage machine learning operations (MLOps). - from Microsoft documentation
Google Vertex AI
Vertex AI is a machine learning (ML) platform that lets you train and deploy ML models and AI applications, and customize large language models (LLMs) for use in your AI-powered applications. Vertex AI combines data engineering, data science, and ML engineering workflows. - from Google documentation
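As a hedged sketch of what using such a platform can look like, the snippet below submits a custom training job to Vertex AI with the Python SDK. The project ID, bucket, training script, container image, and machine type are placeholders to replace with your own values.

```python
# A minimal sketch of submitting a custom training job to Vertex AI.
# Project, region, bucket, script, and container image are placeholders.
from google.cloud import aiplatform

aiplatform.init(
    project="<your-project-id>",            # placeholder
    location="us-central1",                 # example region
    staging_bucket="gs://<your-bucket>",    # placeholder
)

job = aiplatform.CustomTrainingJob(
    display_name="demo-training-job",
    script_path="train.py",                 # your training script
    container_uri="<prebuilt-training-container-uri>",  # pick one from the Vertex AI docs
)

# Runs the script on managed infrastructure; machine type is an example.
job.run(replica_count=1, machine_type="n1-standard-4")
```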
Benefits to the organization
Cost efficiency.
Teams can leverage publicly available data sets, algorithms and models, allowing them to experiment and introduce capabilities and AI solutions into the company without a big investment.
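As one hedged illustration, a team can pull a publicly available pretrained model from the Hugging Face Hub and start experimenting in a few lines; the model shown is just one example of an openly available checkpoint.

```python
# A minimal sketch of experimenting with a publicly available pretrained model.
# The model name is one example of an openly available checkpoint.
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

print(classifier("The new self-serve reporting dashboard saves us hours every week."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```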
Mitigate team barriers and skill gaps.
Today, learning AI does not demand a big investment, thanks to the community behind it and the AI models available in the cloud. It has also become accessible to everyone, not only data scientists and developers. Highly technical teams will build and train the models, while Gen AI talent (power and casual users well versed in Gen AI solutions) will use the technology regularly, bringing efficiencies to their daily work and becoming more valuable to the organization.
A company can also reconsider the geographic distribution of its talent, now more than ever, when hybrid and remote working has become the reality for almost all disciplines within the organization. With little investment, companies can speed up the learning and usage of AI solutions for individual contributors across teams and geographies (including remote talent), providing a well-balanced and exciting workplace. This pushes for reorganizing from functional silos to integrated, cross-functional teams aligned to products or platforms, which can increase employee satisfaction, reduce time to talent development, boost engagement, and accelerate onboarding.
Accelerates innovation.
Innovation, generally speaking, starts with research, and AI helps by bringing automation as an enabler. For example, if a team uploads a set of white papers and other documents on a certain topic (such as the impact of polluting our rivers) to a model, it can then synthesize and summarize them and enable search, helping researchers surface new ideas to solve the problem at stake.
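A minimal sketch of the search part of that workflow, assuming a small list of in-memory abstracts and the sentence-transformers library; a real research assistant would also add summarization and a proper document store. The abstracts below are made up for illustration.

```python
# A minimal semantic-search sketch over a handful of hypothetical abstracts.
from sentence_transformers import SentenceTransformer, util

documents = [
    "Industrial runoff is a leading contributor to river pollution.",
    "Microplastics accumulate in freshwater ecosystems over time.",
    "Wetland restoration can filter pollutants before they reach rivers.",
]

model = SentenceTransformer("all-MiniLM-L6-v2")  # small, openly available embedding model
doc_embeddings = model.encode(documents, convert_to_tensor=True)

query = "How can we reduce pollutants entering rivers?"
query_embedding = model.encode(query, convert_to_tensor=True)

# Rank documents by cosine similarity to the query.
scores = util.cos_sim(query_embedding, doc_embeddings)[0]
for doc, score in sorted(zip(documents, scores.tolist()), key=lambda x: -x[1]):
    print(f"{score:.2f}  {doc}")
```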
Prototyping is another great enabler provided by AI. Experimentation and a commitment to evolving ideas are paramount and support the drive for new sources of growth.