Introducing MarkLogic Server 12: Built for the GenAI Era


Mitch Shepherd
MarkLogic Server 12 powers intelligent search for enterprise AI so you can innovate faster, scale smarter and empower users and LLMs with the right information.

MarkLogic Server 12 is here, and it’s a bold step into the AI-defined reality of today and its promise of tomorrow. With this release, we’re delivering the most flexible, intelligent, scalable and secure platform yet, built for the demands of modern applications and generative AI.

Whether you're developing AI-driven systems, exploring massive unstructured datasets or modernizing legacy architectures, MarkLogic Server 12 introduces powerful capabilities to accelerate what’s next in enterprise AI and decision-making.

The highlights? Native vector search and Virtual Views—two powerful new features that dramatically expand how you can query, analyze and extract value from your content. Here are more things you can do with the latest updates:

  • Maximize GenAI Response Accuracy with true hybrid search
  • Answer Business Questions Faster with ad-hoc analytics at scale
  • Meet Modern Application Demands with optimized scaling for peak workloads
  • Enhance Your Security Posture with OpenSSL 3.3 and TLS 1.3 support
  • Modernize Your Stack with improved platform support

Let’s dive in.

Intelligent Search Experiences Start Here: Native Vector Search


Search has always been at the heart of MarkLogic Server. With version 12, we’re taking it to the next level. Vector search is now built into the core of the platform, enabling semantic similarity retrieval that’s essential for Retrieval-Augmented Generation (RAG) and other GenAI use cases.

Unlike traditional keyword search, vector search lets you retrieve results based on meaning—not just matching words. You can store vector embeddings directly inside JSON or XML documents, enabling large-scale similarity search on unstructured content. Documents or parts of documents can be represented as high-dimensional vectors and searched for conceptual closeness.
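The idea of "conceptual closeness" can be made concrete with cosine similarity, the standard measure for comparing embedding vectors. The sketch below is illustrative only, not MarkLogic's API: the document names and four-dimensional vectors are toy stand-ins for real embeddings, which typically have hundreds of dimensions.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors: values near 1.0
    mean the vectors point the same way (similar meaning)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 4-dimensional embeddings standing in for the high-dimensional
# vectors stored alongside each JSON or XML document.
docs = {
    "refund-policy.json":  [0.9, 0.1, 0.0, 0.2],
    "shipping-times.json": [0.1, 0.8, 0.3, 0.0],
    "returns-howto.json":  [0.7, 0.3, 0.2, 0.3],
}
# Embedding of a query like "how do I get my money back?"
query = [0.85, 0.15, 0.05, 0.25]

# Rank documents by semantic closeness to the query, best first.
ranked = sorted(docs, key=lambda d: cosine_similarity(query, docs[d]),
                reverse=True)
```

Note that the refund and returns documents rank ahead of the shipping one even though none of them shares literal words with the query string; that word-independence is what distinguishes vector search from keyword matching.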

Vector search works seamlessly with your data foundation. The MarkLogic platform combines the enterprise-grade capabilities you need for AI and mission-critical applications: efficient indexing, flexible search algorithms and robust, built-in security. Resilience is engineered into the platform architecture itself, so you can scale your systems to meet future AI and production-environment demands.

But we didn’t stop there. MarkLogic Server allows you to combine vector search with full-text, semantic, geospatial and SQL queries—all in a single API. This hybrid approach brings the best of lexical and dense vector results so that your AI systems retrieve the most relevant, trustworthy information every time. The result: high-precision, truly multi-modal search that can power intelligent, contextually-aware search and AI applications that understand your users and your world.
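One common way to blend lexical and vector result lists (a standard fusion technique, not necessarily how MarkLogic combines them internally) is reciprocal rank fusion, sketched below with hypothetical document IDs:

```python
def reciprocal_rank_fusion(rankings, k=60):
    """Fuse several ranked result lists into one.

    Each document scores sum(1 / (k + rank)) over every list it
    appears in; the constant k dampens the influence of any single list.
    """
    scores = {}
    for ranking in rankings:
        for rank, doc in enumerate(ranking, start=1):
            scores[doc] = scores.get(doc, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical result lists for the same query:
keyword_hits = ["doc-17", "doc-04", "doc-92"]   # lexical (exact-term) matches
vector_hits  = ["doc-04", "doc-55", "doc-17"]   # semantic (embedding) matches

fused = reciprocal_rank_fusion([keyword_hits, vector_hits])
```

Documents that score well on both signals (here doc-04 and doc-17) rise to the top of the fused list, which is the intuition behind hybrid search: lexical precision plus semantic recall.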

Use cases for vector search include:

  • Retrieval-Augmented Generation (RAG): Surface the most relevant knowledge to power LLM responses.
  • Personalized Recommendations: Go beyond clicks to suggest content users actually want.
  • Similarity Search: Find conceptually related records, even when the wording is completely different.
  • Risk Management: Detect patterns and connections early across unstructured narratives.

Faster Analytics Deployment. Zero Reindexing.


The MarkLogic platform has long supported powerful analytics through Template Driven Extraction (TDE), allowing structured SQL-style access to document content. Now with Virtual Views, we’re unlocking a new level of flexibility and performance for analytics at scale.

Virtual Views is a powerful new capability that lets you project unstructured data into relational views at query time. This opens the door to new large-scale analytic workloads in real time, as well as more flexible data exploration where speed and agility matter most. Because views are generated on demand, you avoid reindexing or materializing all the data in the database, and you skip the time and cost of building indexes on view columns you may rarely query. The result: faster delivery cycles for ad hoc reporting, lower storage costs and more responsive data exploration.

And when needed, you can still combine them with materialized views for performance-critical queries. The query engine automatically leverages the universal index to quickly identify only the subset of documents that can answer the question at hand. Whether you're prototyping a dashboard, validating a hypothesis or exploring a new dataset, Virtual Views help you move from question to insight quickly.

Here's where Virtual Views shine:

  • Large-scale, ad hoc analytics: Analyze massive datasets without upfront storage costs.
  • Data discovery and prototyping: Create relational-style views on the fly with zero pipeline changes.
  • Optimized performance: Join Virtual Views with materialized views for a balanced approach to speed and flexibility.
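Conceptually, a query-time view projects document fields into rows only when a query asks for them, with no precomputed column indexes. The sketch below shows that idea in miniature; it is a simplification under assumed document shapes, not MarkLogic's TDE or Virtual Views implementation:

```python
# Each "document" is a JSON-like dict; no column indexes exist up front.
orders = [
    {"order": {"id": 1, "region": "EU", "total": 120.0}},
    {"order": {"id": 2, "region": "US", "total": 80.0}},
    {"order": {"id": 3, "region": "EU", "total": 45.5}},
]

def project_view(docs, columns):
    """Project documents into relational rows on demand, skipping any
    document that lacks one of the requested fields."""
    rows = []
    for doc in docs:
        record = doc.get("order", {})
        if all(col in record for col in columns):
            rows.append(tuple(record[col] for col in columns))
    return rows

# SQL-style question, answered at query time: total revenue per region.
rows = project_view(orders, ["region", "total"])
revenue = {}
for region, total in rows:
    revenue[region] = revenue.get(region, 0.0) + total
```

The projection happens per query, so adding a new "column" to explore is a code change, not a reindexing job; that is the trade-off Virtual Views make in favor of agility, with materialized views still available when raw speed matters.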

Built to Scale, Built to Meet Modern Application Demands


Personalized conversational experiences require massive real-time unstructured data processing, placing substantial demands on application infrastructure and computational resources. MarkLogic Server 12 includes dozens of enhancements aimed at improving developer experience, performance and operational control. From expanded language support to UI upgrades and more flexible security configurations, every improvement helps teams move faster and build smarter.

MarkLogic Server 12 introduces Dynamic Hosts, allowing you to scale compute resources up or down based on demand. This ensures consistent performance during usage spikes while optimizing infrastructure costs.

And because security is non-negotiable, we’ve hardened the platform with support for OpenSSL 3.3, TLS 1.3, Federal Information Processing Standards (FIPS) 140-3 readiness and upgraded operating system (OS) support so you can take advantage of the latest technology.

Support for Red Hat Enterprise Linux 9, Amazon Linux 2023, Windows Server 2022 and rootless Docker, plus an upgraded embedded JavaScript engine, lets developers use modern tools and frameworks, benefit from performance improvements and support future innovation and AI-driven applications across multi-cloud, cloud-native or hybrid environments.

Conclusion


Organizations are under pressure to move faster, make smarter decisions and do it all with less risk. MarkLogic Server 12 is designed to meet that challenge. It’s a multi-model data platform that brings together structured and unstructured data, advanced search and AI-ready capabilities—all in one place.

Whether you’re building intelligent applications, scaling AI projects or modernizing your data architecture, the MarkLogic platform is your launchpad.
