
BasisAI: Accelerating AI Adoption with Google Cloud

Google Cloud Platform
Analytics & Modeling - Machine Learning
Cybersecurity & Privacy - Cloud Security
Equipment & Machinery
Transportation
Product Research & Development
Quality Assurance
Chatbots
Predictive Maintenance
Cloud Planning, Design & Implementation Services
Training

BasisAI, a company that helps enterprises accelerate AI adoption, faced several challenges in its mission to deploy responsible AI applications. The company needed to ensure that the AI systems it helped develop were free of biases that could favor certain groups of customers over others and erode consumer trust. Taking AI from code to production required tight collaboration between data scientists and DevOps teams within an organization, which could be complex and time-consuming. Managing the infrastructure for machine learning operations (MLOps) was also a significant burden, particularly around resource allocation and handling traffic spikes. BasisAI further needed robust monitoring of AI models to prevent downtime and keep cloud consumption costs under control. Finally, ensuring data privacy and security was crucial, especially for customers in regulated industries.


BasisAI is a technology company founded in 2018 and based in Singapore. It offers end-to-end AI services, from consultation to fully managed machine learning operations (MLOps), on its proprietary operating system, Bedrock. The company works with large enterprises across industries including financial services, insurance, healthcare, and transportation, at every stage of their AI journey. BasisAI's founders are tech veterans who aim to help large enterprises unlock the power of AI; they developed Bedrock to help customers build and manage responsible AI applications from development to deployment.


BasisAI developed Bedrock, a cloud-based platform-as-a-service (PaaS) that helps enterprises quickly deploy responsible AI. Bedrock provides tools for MLOps, enabling data scientists and operations professionals to collaborate on managing the machine learning production lifecycle. BasisAI runs Bedrock on Google Cloud using Google Kubernetes Engine (GKE), stores log messages from customer projects with Cloud Logging, and uses Security Command Center to gain visibility into security issues. By automating workflows for training, reviewing, and deploying machine learning (ML) systems, Bedrock reduces their time-to-market by up to 70%. BasisAI moved to a fully managed environment with GKE and uses Terraform, an infrastructure-as-code tool, with Google Cloud to create and manage all aspects of customer projects. Autoscaling on GKE lets BasisAI dynamically allocate resources to customer projects, saving costs on compute. Bedrock also offers built-in governance capabilities that give visibility into every part of the AI workflow and promote fairness.
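
The case study does not publish BasisAI's Terraform modules or GKE manifests, so the sketch below is only an illustration of the autoscaling idea: using the official kubernetes Python client to attach a HorizontalPodAutoscaler to a per-customer model-serving Deployment so capacity follows traffic. The namespace, Deployment name, and thresholds are hypothetical, not BasisAI's actual configuration.

```python
from kubernetes import client, config


def create_customer_hpa(namespace: str, deployment: str,
                        min_replicas: int = 1, max_replicas: int = 10,
                        cpu_target: int = 70) -> None:
    """Attach a HorizontalPodAutoscaler to a per-customer model-serving
    Deployment so the cluster scales it with traffic (illustrative only)."""
    config.load_kube_config()  # use config.load_incluster_config() when running inside GKE

    hpa = client.V1HorizontalPodAutoscaler(
        metadata=client.V1ObjectMeta(name=f"{deployment}-hpa", namespace=namespace),
        spec=client.V1HorizontalPodAutoscalerSpec(
            scale_target_ref=client.V1CrossVersionObjectReference(
                api_version="apps/v1", kind="Deployment", name=deployment
            ),
            min_replicas=min_replicas,
            max_replicas=max_replicas,
            target_cpu_utilization_percentage=cpu_target,
        ),
    )
    client.AutoscalingV1Api().create_namespaced_horizontal_pod_autoscaler(
        namespace=namespace, body=hpa
    )


if __name__ == "__main__":
    # Hypothetical customer project isolated in its own namespace.
    create_customer_hpa(namespace="customer-a", deployment="model-serving")
```

Pairing a per-workload autoscaler like this with GKE's cluster autoscaler is one common way to keep idle customer projects cheap while absorbing traffic spikes.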


The implementation of Google Cloud and the development of Bedrock have delivered significant operational benefits for BasisAI. The company has simplified the development experience and reduced the administrative burden, allowing data scientists to focus more on modeling. Google Kubernetes Engine lets BasisAI dynamically allocate resources to customer projects, cutting compute costs and the time spent on infrastructure work. BasisAI has also improved its security posture by identifying and resolving potential threats from misconfigurations and compliance violations. Furthermore, the company can give its customers full transparency into how each AI model works, promoting fairness and preventing unwanted biases, and it can test code before rolling it into production, reducing the risk of issues in the deployment phase.
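
The write-up does not show how BasisAI consumes Security Command Center findings. As a minimal sketch, assuming the google-cloud-securitycenter client library and a placeholder organization ID, active findings (for example, misconfigurations flagged by Security Health Analytics) could be listed like this:

```python
from google.cloud import securitycenter


def list_active_findings(org_id: str) -> None:
    """Print active Security Command Center findings across all sources
    for an organization (sketch; org_id is a placeholder)."""
    client = securitycenter.SecurityCenterClient()
    all_sources = f"organizations/{org_id}/sources/-"  # "-" means every finding source

    results = client.list_findings(
        request={"parent": all_sources, "filter": 'state="ACTIVE"'}
    )
    for result in results:
        finding = result.finding
        print(f"{finding.category}: {finding.resource_name} ({finding.severity.name})")


if __name__ == "__main__":
    list_active_findings(org_id="123456789")  # hypothetical organization ID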

Speeds up business impact for customers by taking ML models from prototype to production in minutes, not months

Reduces infrastructure burden by 25% with machine learning operations (MLOps) that enable data scientists to focus on modeling

Reduces the time-to-market of ML systems by up to 70%
