Engineering · Data Science
Metasense V2: Enhancing, improving, and productionising LLM-powered data governance
In the initial article, we explored the integration of Large Language Models (LLMs) to automate metadata generation, addressing challenges like limited customisation and resource constraints. This integration enabled efficient column-level tag classification and data sensitivity tiering. After the model's initial scan of over 20,000 entries, we identified areas for improvement post-rollout. These improvements have significantly reduced manual workloads, increased accuracy, and bolstered trust in our data governance processes.
Engineering
How we reduced peak memory and CPU usage of the product configuration management SDK
Learn about GrabX, Grab’s central platform for product configuration management. This article discusses the steps taken to optimise the SDK, aiming to improve resource utilisation, reduce costs, and accelerate internal adoption.
Engineering · Data Science
LLM-assisted vector similarity search
Vector similarity search has revolutionised data retrieval, particularly in the context of Retrieval-Augmented Generation in conjunction with advanced Large Language Models (LLMs). However, it sometimes falls short when dealing with complex or nuanced queries. In this post, we explore our experimentation with a simple yet effective approach to mitigate this shortcoming by combining the efficiency of vector similarity search with the contextual understanding of LLMs.
Engineering · Analytics · Data Science
Leveraging RAG-powered LLMs for Analytical Tasks
The emergence of Retrieval-Augmented Generation (RAG) has revolutionised Large Language Models (LLMs), propelling them to new heights. This development prompts us to consider its integration into the field of analytics. Explore how Grab harnesses this technology to optimise our analytics processes.
Engineering · Data Science
Evolution of Catwalk: Model serving platform at Grab
Read about the evolution of Catwalk, Grab's model serving platform, from its inception to its current state, and discover how it has grown to meet Grab's expanding machine learning model serving requirements.
Engineering
Enabling conversational data discovery with LLMs at Grab
Discover how Grab is revolutionising data discovery with the power of AI and LLMs. Dive into our journey as we overcome challenges, build groundbreaking tools like HubbleIQ, and transform the way our employees find and access data. Get ready to be inspired by our innovative approach and learn how you can harness the potential of AI to unlock the full value of your organisation's data.
Engineering
Bringing Grab’s Live Activity to Android: Enhancing user experience through custom notifications
Live Activity is an iOS feature that enhances the user experience by displaying a user interface (UI) outside of the app, delivering real-time updates and interactive content. Discover how it was brought to Android at Grab through custom notifications.
Engineering
How we reduced initialisation time of Product Configuration Management SDK
Discover how we revolutionised our product configuration management SDK, reducing initialisation time by up to 90%. Learn about the challenges we faced with cold starts and the phased approach we took to optimise the SDK's performance.