Anywhere Anytime Data Delivery …

Data Management Services

We are living in the information age, an age where data integrity and real-time access to high-quality data can mean the difference between profit and loss.

Data is a real and valuable company asset; however, due to technical constraints it is rarely accessed and utilized as extensively as it should be to benefit the business.

Being able to aggregate and access data effectively brings enormous benefit and a competitive advantage to any business unit growing its business.

Both the methodology and the technology now exist to unlock and access data in every way possible, getting rid of these constraints.

Data Management disciplines include:

  1. Database Architecture
  2. Data Architecture (Conceptual, Logical, and Physical Data Modeling)
  3. Data Analysis
  4. Data Warehouse
  5. Data Mart
  6. Business Intelligence
  7. Master Data Management
  8. Data Quality Improvement

When architected, designed, and managed properly, master data improves decision making, strengthens customer relationships, and enables competitive advantage.

Enabling an optimal, highly available, and scalable data/information layer for any organization requires the careful planning, design, and implementation of the data components that best match its current and future data delivery needs. This often spans several tiers, may be distributed across multiple sites, and must be tailored and tuned to provide optimal data access, high performance, and the highest data integrity. Our deep experience with most database engines, with complex data architectures, and with high-volume data access allows us to provide a series of “data services” to our customers that deliver those results, time and time again.

Core data categories such as customer data and product data form the cornerstone of this practice offering. Our MDM delivery model embodies a specialized methodology geared toward an optimal MDM architecture and solution, often taking advantage of hub-and-spoke architectures to enable MDM for major organizations.

Customer Data Hub Methodology

1. MDM Customer

  • Governed/Owned by the Business (steward)
  • Technically enabled by IT (custodian)

Must have a Customer Identity Strategy!

2. Registry Technique

  • Authoring at Spokes
  • Cross Reference Only (attributes not mastered in hub)
  • Provides links to sources (that may not share the same data model)
  • Non-invasive (easier to implement, but less attribute consistency)
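As an illustrative sketch (class and source names below are hypothetical, not part of any product), a registry-style hub can be pictured as nothing more than a cross-reference: it mints a master identity and links the spoke-local identifiers to it, while the attributes themselves stay mastered at the sources:

```python
# Minimal registry-style cross-reference: the hub stores only identity
# links back to the spokes; no attributes are mastered in the hub.
class CustomerRegistry:
    def __init__(self):
        self._next_id = 1
        self._xref = {}   # (spoke, local_id) -> master_id
        self._links = {}  # master_id -> set of (spoke, local_id)

    def register(self, spoke, local_id, master_id=None):
        """Link a spoke record to a master identity, minting one if needed."""
        if master_id is None:
            master_id = self._next_id
            self._next_id += 1
        self._xref[(spoke, local_id)] = master_id
        self._links.setdefault(master_id, set()).add((spoke, local_id))
        return master_id

    def sources_for(self, master_id):
        """Return the spoke records that hold this customer's attributes."""
        return self._links.get(master_id, set())

registry = CustomerRegistry()
mid = registry.register("CRM", "c-1001")             # first sighting mints an ID
registry.register("Billing", "B-77", master_id=mid)  # a later match links to it
```

Because the hub never owns the attributes, this approach is non-invasive to the spokes, which is exactly why it trades away attribute consistency.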

3. Co-existence Technique

  • Authoring at Spokes AND at Hub possible (not subscription)
  • Cross References and Golden Record stored at hub
  • Maps attributes to common data model
  • Extended Attributes
  • High Attribute consistency
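The golden-record side of the co-existence technique can be sketched as follows: each spoke's record is mapped onto the common data model, then a survivorship rule picks the winning value per attribute. The spoke names, field mappings, and simple source-priority rule are all illustrative assumptions; real hubs use far richer survivorship logic:

```python
# Illustrative golden-record assembly for the co-existence technique.
SOURCE_PRIORITY = ["CRM", "Billing", "Web"]  # assumed trust order, highest first

def to_common_model(spoke, record):
    """Map a spoke-specific record onto the hub's common attribute names."""
    mappings = {
        "CRM":     {"fname": "first_name", "lname": "last_name", "mail": "email"},
        "Billing": {"first": "first_name", "last": "last_name", "email": "email"},
    }
    return {common: record[src]
            for src, common in mappings[spoke].items() if src in record}

def golden_record(contributions):
    """contributions: {spoke_name: raw_record} -> merged golden record."""
    golden = {}
    for spoke in reversed(SOURCE_PRIORITY):   # apply lowest priority first...
        if spoke in contributions:
            golden.update(to_common_model(spoke, contributions[spoke]))
    return golden                             # ...so the highest priority wins

g = golden_record({
    "Billing": {"first": "Jon", "last": "Smith", "email": "jon@ex.com"},
    "CRM":     {"fname": "Jonathan", "lname": "Smith"},
})
# CRM outranks Billing for the name; Billing still contributes the email.
```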

4. Transactional Technique

  • Authoring at the Hub (hub is the system of record)
  • Complete Golden Record stored at hub; spokes read and write through it
  • Maps attributes to common data model
  • Extended Attributes
  • Highest attribute consistency


Customer Data Hub (CDH) Build Methodology

1. Data Analysis/Data Assessment ** SPOKES **

  • Def’s, Models, Attributes
  • Use cases/Data accesses
  • Volatility/Frequency/Velocity
  • Data Quality assessment
  • Dependencies Upstream/Downstream
  • Rules being applied
  • Standards being applied
  • Logic being applied
  • What we have and what we need

2. Data Analysis/Master Data Model ** HUB **

  • Def’s, Models
  • Identify Core Attributes and Relationships (scope)
  • Use cases/Data accesses starting with CRUD
  • Understand the data Volatility, Frequency, Velocity
  • Identify cross Reference & Registry needs
  • Identify Extended Attributes
  • Start identifying the rules that we need applied
  • Start identifying the standards that we need applied
  • Start identifying the logic that we need applied
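To make the hub-side modeling step concrete, here is a minimal sketch of a master customer entity with the distinctions this step produces: core (mastered) attributes in scope, extended attributes carried per spoke, a cross-reference, and CRUD entry points. All names and the in-memory store are illustrative assumptions, not a prescribed hub design:

```python
from dataclasses import dataclass, field

# Illustrative hub-side master data model: core attributes are in the
# mastering scope; extended attributes ride along per spoke without
# participating in match/merge; xref links back to the spokes.
@dataclass
class MasterCustomer:
    master_id: int
    first_name: str = ""
    last_name: str = ""
    email: str = ""                               # core, mastered attributes
    extended: dict = field(default_factory=dict)  # e.g. {"Billing": {...}}
    xref: dict = field(default_factory=dict)      # spoke -> local id

hub = {}  # master_id -> MasterCustomer: simplest possible "hub" store

def create(master_id, **core):        # C of CRUD
    hub[master_id] = MasterCustomer(master_id, **core)
    return hub[master_id]

def read(master_id):                  # R
    return hub.get(master_id)

def update(master_id, **core):        # U
    for k, v in core.items():
        setattr(hub[master_id], k, v)

def delete(master_id):                # D (real hubs usually soft-delete)
    hub.pop(master_id, None)
```

Starting the use-case analysis with CRUD, as the bullet above suggests, forces every core attribute to have a defined create, read, update, and delete path before more exotic accesses are considered.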

3. Define Business Logic/Workflow ** HUB **

  • Identify and map out the hub based business logic needed
  • Validate that all Use cases and Data accesses are addressed
  • Factor in Volatility, Frequency, Velocity
  • Clearly identify all major Workflows (automated or ones with a human interface)
  • Dependencies identified
  • Identify rules logic to be applied at the hub (cleansing rules, so on)
  • Identify standards to be applied at the hub
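Two examples of the kind of hub-applied cleansing and standardization rules this step identifies (the specific rules below are hypothetical; real rule sets come out of the spoke analysis and the standards chosen for the hub):

```python
import re

# Illustrative hub-side standardization rules: normalize whitespace and
# casing in names, and reduce US-style phone numbers to bare digits.
def standardize_name(raw):
    """Collapse whitespace and apply title case: '  jOhn  SMITH ' -> 'John Smith'."""
    return " ".join(raw.split()).title()

def standardize_phone(raw):
    """Strip formatting and an optional US country code; None = failed validation."""
    digits = re.sub(r"\D", "", raw)
    if len(digits) == 11 and digits.startswith("1"):
        digits = digits[1:]                      # drop leading US country code
    return digits if len(digits) == 10 else None
```

Returning None (rather than a best guess) on validation failure lets the workflow route bad values to a human data steward instead of silently polluting the golden record.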

4. Define Participation Model  ** HUB/SPOKE **

  • Identify and define how each spoke interacts with the hub AND with each other (termed participation model)
  • Clearly identify and define each inbound and outbound behavior in terms of publish, subscribe (provider/consumer)
  • Remember, we are defining a microcosm of organisms that must now live together (not a silo)
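A participation model can be captured as a simple declarative table of publish/subscribe roles per spoke, which the hub then queries to route inbound and outbound changes. The spokes and topics below are illustrative; a real model would also carry frequency and volatility per flow:

```python
# Illustrative participation model: each spoke declares what it publishes
# to the hub (provider role) and what it subscribes to (consumer role).
PARTICIPATION = {
    "CRM":     {"publishes": {"name", "email"}, "subscribes": {"golden_record"}},
    "Billing": {"publishes": {"address"},       "subscribes": {"golden_record", "email"}},
    "Web":     {"publishes": set(),             "subscribes": {"golden_record"}},
}

def consumers_of(topic):
    """Which spokes should receive outbound changes for a given topic?"""
    return {s for s, roles in PARTICIPATION.items() if topic in roles["subscribes"]}

def providers_of(topic):
    """Which spokes are allowed to author inbound changes for a topic?"""
    return {s for s, roles in PARTICIPATION.items() if topic in roles["publishes"]}
```

Making the model explicit like this is what keeps the "microcosm of organisms" honest: a spoke that is not a declared provider for a topic simply cannot author changes to it.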

5. Define Overall/Broader Architecture Participation

  • Clearly identify how the MDM customer data is to be utilized in the broader company architecture
  • Examples include ODS, Sales, Marketing, Finance, EDW, Web Services, SOA, and so on
  • This new microcosm must now fit into the broader universe of your other systems

6. Define Governance, Stewardship, and Business Organization

  • Clearly identify how the MDM of Customer data is managed from the business side (process, workflow, ownership, coordination, and with a liaison into IT – the custodians)
  • Create a stewardship model and organization. This may include a steering committee that acts as a policy maker and compliance arm of this key data category

7. Deliverables and Artifacts

a. Metadata/Model

  • Core attributes to be managed
  • Party-based mappings (hub/spoke)
  • Cross Reference Identities/Registry
  • Ownership model
  • Data Models (hub/spokes)
  • Life cycle (archive, purge, availability)

b. Business Logic

  • Workflow
  • Merge, Match, Dedupe logic
  • Standardization, cleansing
  • Data sync needs
  • Mappings/context
  • Transformations needed
  • Logical/physical merge approach
  • Frequency/Velocity requirements
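The merge/match/dedupe deliverable reduces to rules like the following sketch: an exact email match is treated as a sure match, otherwise normalized name attributes are scored against a threshold. The attributes, weights, and threshold here are illustrative assumptions; production hubs use weighted probabilistic matching across many more attributes:

```python
# Illustrative match rule for dedupe between two customer records.
def norm(s):
    """Normalize for comparison: lowercase, collapse whitespace."""
    return " ".join(s.lower().split())

def match_score(a, b):
    """Score two records in [0, 1]; exact email match short-circuits to 1.0."""
    if a.get("email") and a.get("email") == b.get("email"):
        return 1.0
    score = 0.0
    if norm(a.get("last_name", "")) == norm(b.get("last_name", "")):
        score += 0.6
    if norm(a.get("first_name", "")) == norm(b.get("first_name", "")):
        score += 0.3
    return score

def is_duplicate(a, b, threshold=0.8):
    """Records at or above the threshold are merge candidates."""
    return match_score(a, b) >= threshold
```

Scores that land near the threshold are typically routed to a data steward for manual review rather than auto-merged, which ties this logic back to the stewardship model.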

c. Participation Model

  • Inbound/Outbound definitions
  • Contributing Attributes from each spoke to the hub
  • Overall publishing/subscribing needs (frequency/volatility)

d. Broader Architecture

  • Other system interfaces (Upstream/downstream)
  • General exposure methods (WS, API, Services)
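As a minimal sketch of the API/service exposure method (the endpoint shape, store, and record are hypothetical, not a reference design), the hub's golden record can be surfaced to the broader architecture through a lookup service that returns JSON:

```python
import json

# Illustrative service-layer exposure of the golden record.
GOLDEN = {42: {"first_name": "Jonathan", "last_name": "Smith", "email": "jon@ex.com"}}

def get_customer(master_id):
    """What a GET /customers/{id} handler body would return: (status, json)."""
    record = GOLDEN.get(master_id)
    if record is None:
        return 404, json.dumps({"error": "unknown master_id"})
    return 200, json.dumps({"master_id": master_id, **record})
```

Exposing the hub only through such a service layer (WS, API, or SOA service) keeps downstream consumers decoupled from the hub's physical data model.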


Database Architechs has been operating in the United States since 1991 and in Europe (Paris, France) since 1998, offering unrivaled database design, data modeling, data architecture, business intelligence, distributed data/replication, performance & tuning, high availability, data security, and master data management consulting to its global client base.

Our core data practice consists of some of the world’s top database experts, and our clients have included Intel, Cisco, Apple Computer, Wells Fargo, PG&E, Visa International, Charles Schwab, Toshiba, Sony, and many other Global 5000 organizations.

“Data by Design” is our mantra and this drives each project – from early data requirements all the way through optimized database designs and implementations.

Our database engine coverage includes Sybase Adaptive Server, Microsoft SQL Server, Oracle, DB2, MySQL, and PostgreSQL to name a few.

Our database services also include several outstanding database and SQL courses, as well as SQL Shot, a graphical performance-and-tuning product for the Sybase, SQL Server, and Oracle DBMS platforms.

We are the authors of Sybase’s performance and tuning and physical database design methodologies (and courseware), and some of our expert data professionals are noted authors of bestselling books such as:

  • Microsoft SQL Server 2000 Unleashed. Bertucci, Paul (SAMS – December 1, 2002).
  • Microsoft SQL Server 2005 Unleashed. Bertucci, Paul (SAMS – April 20, 2007).
  • Microsoft SQL Server 2008 Unleashed. Bertucci, Paul (SAMS – Coming Soon – May 2010).
  • Microsoft SQL Server High Availability. Bertucci, Paul (SAMS – November 5, 2004).
  • Teach Yourself ADO.NET in 24 Hours. Bertucci, Paul (SAMS – May 10, 2002).
  • Cryptography in the Database: The Last Line of Defense. Kenan, Kevin (October 19, 2005).
  • Sybase SQL Server 11 Unleashed. Rankins, Ray (SAMS – April 16, 1996).

Moreover, we have authored database courses for the database companies themselves, including Sybase’s “Performance & Tuning Methodology”, Sybase’s “Physical Database Design”, and Dr. Peter Chen’s “Entity/Relationship Modeling and DB Design”, to name a few.

Database Architechs is based in the Pacific Northwest with offices in California and Oregon and serves most of Europe from Paris, France.