Photo credit: RawPixel/Freepik
With AI being embraced across industries, cloud data service providers have begun making their move to ensure they do not miss out on the opportunity. By building artificial intelligence features into their products and services, these providers hope to drastically reduce inefficiencies and deliver more AI-driven results.
In the global data cloud market, Snowflake, an American cloud-based data storage company, has introduced updates to its Cortex AI, enabling users and companies to take advantage of AI-powered solutions.
According to the announcement, Snowflake Cortex AI is built to give everyone within an organization easy access to large language models (LLMs), allowing them to build and deploy AI-powered applications.
To understand more about Cortex AI, The Byteline interviewed Snowflake's Mohamed Zouari, General Manager for META, and Aamer Mushtaq, Regional Solutions Engineering Manager.
How does Snowflake's introduction of Native Apps and Cortex aim to transform the way businesses integrate and utilize data across different platforms?
Mushtaq: Let's first talk about the challenges with applications. There are two parties involved in any application: the consumers and the providers. The consumers have a couple of challenges. First, they need to trust the providers completely in how the application is created. And if they create their own application, they need to manage and maintain it themselves as well.
Providers, on the other hand, have challenges around doing upgrades, versioning, and getting the trust of the consumer that the application doesn’t have anything malicious inside. So, those are a couple of challenges.
From a deployment point of view, applications have different stacks. There's a complete application stack, and the data is usually stored in a different tier, which creates challenges around manageability.
The idea of native apps is to bring the application to the data. This simplifies the way applications are deployed. We bring it to a Snowflake account owned by the customer, so they know exactly what data the application is accessing.
Providers can also see what portions of the application the consumer is accessing, which results in higher trust for both parties. It also makes it easier for the provider to upgrade and perform versioning for the application without facing downtime. That’s really the benefit of native apps we’re bringing to the market.
So Cortex applies the same logic we use with applications: we're bringing AI to the data, making it simpler for organizations to embed AI functionality on top of the data foundation they're building.
Can you explain the significance of the Snowflake Gen AI & LLM feature, and how it differs from existing AI-driven data solutions in terms of enhancing business decision-making?
Zouari: Cortex AI is our Gen-AI and LLM offering. The idea is the same as with the native apps — making things simpler, building the data in one place, and bringing the workloads to the data.
It’s important that we provide our customers with the best data foundation. When topics like Gen-AI and LLM came into the market, instead of our customers looking for other platforms to handle these workloads, we brought them to the Snowflake platform.
This maintains our philosophy of simplicity, security, and governance and also helps reduce time to market. Gen-AI is not just hype; it’s a reality. Everyone is looking to transform customer interactions and make decisions in real time.
The capability to execute quickly and demonstrate results in less than 15 minutes, as we show customers every day, is key. If you're building a platform for a year, your competitors will have already proposed hundreds of solutions using Gen-AI and LLM through platforms like Snowflake.
What industries or business sectors stand to benefit the most from Snowflake's new advancements, particularly the Gen AI & LLM feature?
Zouari: It’s not industry-specific, honestly. Anyone can benefit from Gen-AI and LLM. It’s more about use cases. We are a B2B business.
Our customers will build solutions for their customers. We're not like ChatGPT, for example. But some major use cases are emerging across industries, such as automated business intelligence.
Instead of building dashboards and relying on self-service, which is now commonplace, our customers need to talk to their data. They can interact with their data in natural language, which can be useful for any employee in finance, legal, or HR.
For example, someone can ask questions on contracts to understand certain clauses or analyze data at any scale. This capability can be applied across industries.
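To make the contract example concrete, here is a minimal sketch of how such a question could be asked from Python using Snowflake's documented SNOWFLAKE.CORTEX.EXTRACT_ANSWER function. The contracts table, its columns, and the connection details are hypothetical placeholders, not something described in the announcement.

```python
# Minimal sketch: asking a natural-language question of contract text with
# Cortex's EXTRACT_ANSWER function via the official Snowflake Python connector.
# Table and column names below are illustrative placeholders.
import snowflake.connector

# Placeholder credentials; substitute real account details.
conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user>",
    password="<password>",
    warehouse="<warehouse>",
    database="<database>",
    schema="<schema>",
)

# EXTRACT_ANSWER looks for the passage in each document that answers the
# question, so legal or HR staff can query clauses without writing code.
query = """
    SELECT contract_id,
           SNOWFLAKE.CORTEX.EXTRACT_ANSWER(
               contract_text,
               'What is the termination notice period?'
           ) AS answer
    FROM contracts
    LIMIT 10
"""

with conn.cursor() as cur:
    cur.execute(query)
    for contract_id, answer in cur:
        print(contract_id, answer)
```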
Another use case is sentiment analysis of customer feedback. Call centers and support channels generate massive amounts of interactions, and it’s hard to manually assess the sentiment behind them all.
Using Gen-AI and LLM, customers can analyze this data at scale and define next actions to improve products or customer satisfaction. AI-driven chatbots can handle 90% of questions, freeing up human agents to focus on more complex cases.
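A similar sketch for the sentiment use case, assuming a hypothetical support_calls table with call_id and transcript columns and a connection object like the one opened above; SNOWFLAKE.CORTEX.SENTIMENT is the documented Cortex function that returns a score between -1 (negative) and 1 (positive).

```python
# Minimal sketch: scoring call-center transcripts at scale with Cortex's
# SENTIMENT function. `conn` is a snowflake.connector connection as above;
# the table and threshold are illustrative.

def flag_negative_calls(conn, threshold: float = -0.5):
    """Return (call_id, score) pairs whose sentiment falls below `threshold`."""
    query = """
        SELECT call_id, sentiment_score
        FROM (
            SELECT call_id,
                   SNOWFLAKE.CORTEX.SENTIMENT(transcript) AS sentiment_score
            FROM support_calls
        )
        WHERE sentiment_score < %s
    """
    with conn.cursor() as cur:
        cur.execute(query, (threshold,))
        return cur.fetchall()

# Human agents then follow up only on the flagged interactions:
# for call_id, score in flag_negative_calls(conn):
#     print(f"Escalate call {call_id}: sentiment {score:.2f}")
```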
With Snowflake’s focus on AI and large language models, how do these tools address the growing demand for real-time analytics and predictive insights?
Mushtaq: Snowflake is a comprehensive data platform. We bring in data from different sources and apply AI on top to allow customers to do various things, like sentiment analysis or conversing with their data.
Customers, especially non-technical users, can interact with their data in their preferred language and make decisions based on the insights they gain. We’re also providing APIs as part of Cortex, which lets customers build custom applications like chatbots or search tools, extending what business analysts can do with LLMs.
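As a rough illustration of what building on those APIs can look like, the sketch below wraps Cortex's COMPLETE function in a small chatbot-style helper. The model name is one of the options Snowflake lists for COMPLETE (availability varies by region), and the connection object is assumed to be a snowflake.connector connection like the one in the earlier sketches.

```python
# Minimal sketch: a thin chatbot-style wrapper around Cortex's COMPLETE
# function. Model choice and usage are illustrative, not prescriptive.

def ask_cortex(conn, question: str, model: str = "mistral-large") -> str:
    """Send a natural-language question to an LLM running inside Snowflake.

    The prompt and the answer stay inside the customer's account boundary,
    which is the security point made elsewhere in this interview.
    """
    # Bind the model and prompt as parameters instead of interpolating them.
    sql = "SELECT SNOWFLAKE.CORTEX.COMPLETE(%s, %s)"
    with conn.cursor() as cur:
        cur.execute(sql, (model, question))
        (answer,) = cur.fetchone()
    return answer

# Example: a business analyst asking a question in plain language.
# print(ask_cortex(conn, "Summarize last week's top customer complaints."))
```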
With the rise of AI and LLM solutions, many users, including regulators, are concerned about data privacy and security. How does Snowflake ensure data privacy and security with the integration of advanced AI models, especially considering the complexities of large-scale business data operations?
Zouari: This is a key point. Snowflake is a one-stop shop for all your data, and that includes AI workloads like Gen-AI and LLM.
When you build your data platform, you're already setting security and governance rules. So, instead of adding complexity with new platforms, you’re inheriting security from your data foundation.
Our customers can trust that their data and the questions they ask won’t be used by anyone else because everything stays within the boundaries of their account. This ensures data privacy, and it’s very important that customers can trust our solution to interact with their data without fear of it going public.
With the recent AI treaties and regulations being discussed worldwide, how is Snowflake planning to navigate through this changing regulatory environment?
Zouari: Absolutely. Let's take a step back to focus on the data, as AI relies on data. Snowflake delivers a fully managed service that complies with regulations like GDPR and the Cloud Act. We also ensure contractual compliance with customers in our terms of service.
As new AI regulations come into effect, we will work with legal teams and product engineering to ensure we meet those standards. We recently amended our terms of service to accommodate AI regulations.
Mushtaq: We’re also thought leaders in this space. Our compliance team published an AI security framework to help the industry think about the risks and challenges associated with AI. This framework outlines how to build applications that minimize risks like hallucination and maintain security. So, we published that white paper as a research document for the rest of the community to benefit from.
With our Snowpark Container Services, customers can bring any model they like, including open-source models, into their secure Snowflake environment. We guarantee that the model and anything it does will remain within their environment, without sending any data to the outside world. This creates more trust and allows them to leverage AI with greater confidence.
What are Snowflake's future plans for the next five years?
Zouari: Snowflake's story began 12 years ago as the first cloud-native data warehouse. From the start, we aimed big, building a foundation for a highly scalable platform, both horizontally and vertically.
Horizontally, this means incorporating new technologies and evolving from data warehousing alone to data lakes, data engineering, data science, and now Gen-AI and ML, as well as building applications.
Vertically, we scale with the volume of data and compute. Today, we run more than 5 billion queries per day on 10 gigabytes of data.
We are continuing on this path, staying up to date with the latest technologies and bringing all data- and AI-related workloads to our platform. This year in particular, all of these products are reaching general availability. It's a huge year for us, as everything we've built over the past months is now in the hands of our customers.
Mushtaq: Just to emphasize what Mohamed said, Snowflake has been innovating from day one. Innovation is in our DNA. We've disrupted the data warehousing market, the application development market, and the way data itself is developed.
Our next frontier is AI, specifically generative AI. Recently, in Italy, we did a research survey in collaboration with MIT to explore what's next and identify upcoming industry trends. We're looking at how large organizations should create their data strategies to tap into AI and its future possibilities. It's a continuous journey, and rest assured, Snowflake will be at the forefront.