Simplifying Intelligent Search Deployments Through Robust Information Architecture

Simply put, the effort, energy, and forethought you invest in the content strategy and overall information architecture of your source systems significantly impacts the complexity and time needed to deliver a robust, comprehensive enterprise search solution. Customers often embark upon their journeys with our products with the upfront understanding that enterprise search can be, and often is, a complex problem. While our products such as SmartHub, AutoClassifier, and Smart Previews are immensely powerful and flexible, they are inherently as intricate as the problems they are designed to address.

In this post I will explain how the natural evolution of content systems over time leads to an information architecture "aha" moment, where organizations face a fork in the road: either adopt sound best practices and redevelop their platforms in a more scalable, service-oriented fashion, or meet the ever-evolving needs of the business with a chaotic and unmanageable situation that demands a far more complex and support-intensive search strategy.

Regardless of your underlying content sources, be they SharePoint or connected systems of any variety, sound information architecture has long been the hallmark of simplified distributed content management and collaboration deployments. From an enterprise perspective, these projects tend to follow a well-established progression observed over many years and across many industries. Systems are stood up and adopted organically as organizational capability slowly grows. Once the critical mass and criticality of these systems become apparent, central IT takes a more prominent role in their management and support. That change inevitably leads to a review and expansion of the platforms into something resembling an actual services deployment, though there is typically a reluctance to upend the content for fear of disrupting end-user productivity. Eventually these services, now far more recognized for their value to the business, become wholly unmanageable from a central IT standpoint. The realization that central IT can no longer support, manage, deliver, and secure content on behalf of users in a sustainable way leads to the recognition that the resulting mess is well worth cleaning up. This is usually the first time a wholesale review of the usage, content, and long-term portfolio strategy for these systems is undertaken, and the first time widely accepted best practices for them are explored in detail.

This is the crucial decision point that becomes the fulcrum for how complex an organization's fast-approaching enterprise search initiative will become. As the redeployment of the underlying systems along established best practices begins, the age-old question of information architecture moves front and center. The various siloed, solution-centric departmental deployments must be rebuilt on a genuine enterprise service platform, which allows a level of standardization of taxonomy, classification, metadata, and structure that previously wasn't possible. The arrangement of the content, while seemingly more important in the near term, eventually yields to the strategic importance of delivering, enforcing, and maintaining information governance and standards across whatever the resulting arrangement of content may become. This level of maturity and architectural forethought provides many long-term advantages, including adaptability, ease of migration, and the ease with which content can ultimately be aggregated and enriched, allowing users to quickly find, refine, and consume content across multiple underlying content systems without needing to understand the nature of each system's deployment or evolution.

So, what does all of this mean in practical terms from a deployment standpoint, and how does it affect the complexity of the overall search strategy? Let's take SharePoint as an example. When SharePoint is delivered as a service, using a syndicated and well-communicated set of content types, columns, and enterprise managed metadata, those elements are used for content enrichment and classification in much the same way across divisions, departments, and teams within the organization. Common refinable properties are mapped and reused at the enterprise level, allowing users to filter and explore content from a variety of sources and sites they have access to without needing to understand how that content is arranged. This standardized approach also simplifies the setup, configuration, and customization of search and of your chosen BA Insight software components.

For example, if you have a handful of syndicated content types and columns, used in conjunction with an enterprise taxonomy expressed through managed metadata services, your implementation of BA Insight products such as AutoClassifier and SmartHub will be significantly streamlined and easier to upgrade and maintain, simply because every content site in the deployment leverages a common, well-established set of classification elements. The creation and mapping of managed and crawled properties becomes a front-loaded activity rather than an ongoing maintenance burden, and you can take full advantage of the semantic, rules-based classification and metadata-population capabilities of AutoClassifier's annotation features.
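To make the "front-loaded mapping" idea concrete, here is a minimal sketch in Python. The column and property names (`ows_Department`, `RefinableString00`, and so on) are hypothetical illustrations of SharePoint-style crawled and refinable managed properties, and the function is not any actual SharePoint or BA Insight API; it simply shows how one shared mapping, defined once, can serve documents from every site when all sites use the same syndicated columns.

```python
# Hypothetical sketch: a single, front-loaded mapping from syndicated site
# columns (crawled properties) to refinable managed properties. Because every
# site uses the same syndicated columns, one mapping covers the whole deployment.
PROPERTY_MAP = {
    "ows_DocumentType": "RefinableString00",  # enterprise content type
    "ows_Department": "RefinableString01",    # enterprise taxonomy term
    "ows_ReviewDate": "RefinableDate00",      # common date column
}

def map_crawled_properties(doc: dict) -> dict:
    """Translate a crawled document's raw properties into the shared
    managed-property schema used by every search experience."""
    return {
        managed: doc[crawled]
        for crawled, managed in PROPERTY_MAP.items()
        if crawled in doc
    }

# Documents from two different departmental sites use the same columns,
# so the same mapping applies with no per-site customization.
hr_doc = {"ows_DocumentType": "Policy", "ows_Department": "HR"}
eng_doc = {"ows_DocumentType": "Spec", "ows_Department": "Engineering",
           "ows_ReviewDate": "2024-06-01"}

print(map_crawled_properties(hr_doc))
print(map_crawled_properties(eng_doc))
```

The point of the sketch is the shape of the work: when columns are syndicated, the mapping table is written once and reused, whereas siloed deployments would each require their own table and its ongoing upkeep.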

This concept of a simple, easy-to-understand enterprise standard for content classification and metadata entry can be extended across multiple systems and services, and can evolve over time to meet the ongoing needs of the business. BA Insight's connector framework, which seamlessly integrates your chosen back-end search platform with other line-of-business applications, makes such a shared standard easy to leverage for content enrichment as well as for integration with external services such as Natural Language Query and AI.
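A hedged sketch of how that shared standard extends beyond SharePoint: the source names, field maps, and `normalize` function below are hypothetical (they are not the connector framework's actual API), but they illustrate the pattern of translating each system's native fields into one enterprise schema so that downstream enrichment and refiners treat all content uniformly.

```python
# Hypothetical sketch: per-source field maps that translate each system's
# native metadata fields into one shared enterprise schema. Downstream
# enrichment, classification, and refiners then see a uniform shape
# regardless of which system a document came from.
SOURCE_FIELD_MAPS = {
    "sharepoint": {"ows_Department": "department", "ows_Title": "title"},
    "fileshare": {"dirname": "department", "filename": "title"},
}

def normalize(source: str, doc: dict) -> dict:
    """Map a source document's native fields into the shared schema,
    keeping provenance so refiners can still filter by system."""
    field_map = SOURCE_FIELD_MAPS[source]
    normalized = {field_map[k]: v for k, v in doc.items() if k in field_map}
    normalized["source"] = source  # provenance for filtering/refinement
    return normalized

# Two systems, two native shapes, one resulting schema.
print(normalize("sharepoint", {"ows_Department": "Legal", "ows_Title": "NDA"}))
print(normalize("fileshare", {"dirname": "Legal", "filename": "nda.docx"}))
```

Because both documents end up with the same `department` and `title` fields, a single refiner configuration works across both systems, which is exactly the payoff of governing the standard centrally rather than per source.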

I realize these concepts can seem somewhat abstract from a tactical standpoint. As an experienced enterprise architect who has witnessed first-hand the shortfalls of failing to consider the strategic value of taking the time to formulate such standards, I can assure you the investment not only pays off long term but is a necessary prerequisite for any serious enterprise search project.

To learn more about any of the BA Insight products mentioned in this post, please visit our products page or contact us for more information about your specific needs, to schedule a demo, or to inquire about formulating a proof-of-concept (POC) based on the specifics of your unique environment and data requirements.
