
Arctec Views
January 27, 2004

In this issue:

    * Enterprise Data Architecture, Part II
    * Enterprise Architecture News
    * Reader Views
    * Arctec Group News:
     Arctec Group CTO Gunnar Peterson is featured
     speaker at BlackHat Windows Security Conference


Arctec Group is an architectural services company focused on Enterprise Architecture issues. With this newsletter, we aim to serve our clients, partners, and colleagues by providing our view on current issues and best practices in Enterprise Architecture, as well as aggregating interesting news from around the globe. We hope you find the newsletter useful and enlightening. We would like to hear your thoughts on current affairs and ideas to improve this offering. Subscribe and unsubscribe information is at the bottom of this email. Previous issues are available.


Enterprise Data Architecture: The afterthought, Part II

Welcome back for the second installment of Enterprise Data Architecture: The afterthought. I hope you had a wonderful holiday season. The second part of this article will focus on some specific techniques that can assist organizations in successful data architecture. Before we discuss these, I would like to walk through some high-level steps to get your organization started with Dimensional Modeling.

When considering successful data architecture, first look at identifying the core business functions and events that support enterprise-wide strategic objectives. Strategic objective examples might include becoming the largest company by market share within your defined industry space, or becoming the technology leader within your industry space. These strategic objectives should translate into goals such as increased profitability, improved customer service, increased product/service value, etc. Regardless of whether your company competes on price, customer service, or product/service excellence, you should be able to translate your strategic objectives into goals with actionable measures that allow your business, with the use of information, to better track and manage performance. In Part 1 of this article, we discussed an approach to define the relationship between business measures and the decision support context that is associated with them. Business Dimensional Modeling allows you to specify these measures in raw form. In Business Dimensional Modeling we call these Facts.

Now that you have defined the actionable metrics that your business will utilize to measure its performance, it is time to determine the contexts of your business environment that will help you better understand and analyze what these measures mean to your business. Business measures make little (or no) sense without some context associated with them. Business Dimensional Modeling is an approach that allows you to define this logical representation and its association with defined business measures. This includes who, what, when, where, and why context associated with business events.

Next, let's take an inventory of the current operational systems that are in place to support core business functions and track business events. Do you see the business context association with these systems? Is the system in place for data administration or transaction management? Data administration system types usually relate to the management of business context, or Dimensions. Examples might include customer management, product/service management, contract management, location management, etc. Note: the act of data administration itself can sometimes be translated into facts that the business deems valuable to measure. Transaction management system types usually relate to completing and tracking a core business function or event. Examples might include product/service transactions, purchasing transactions, process transactions, etc.
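To make the vocabulary concrete, the Facts and Dimensions discussed above can be sketched as a simple star schema. The sketch below uses Python's built-in sqlite3 module; all table and column names are illustrative, not prescribed by this article.

```python
import sqlite3

# Dimension tables hold the who/what/when context; the fact table holds
# raw, additive business measures keyed to those dimensions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT, region TEXT);
CREATE TABLE dim_product  (product_key  INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE fact_sales (
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    product_key  INTEGER REFERENCES dim_product(product_key),
    sale_date    TEXT,
    quantity     INTEGER,   -- a Fact: a raw business measure
    amount       REAL       -- another Fact
);
""")
conn.execute("INSERT INTO dim_customer VALUES (1, 'Acme Co', 'Midwest')")
conn.execute("INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware')")
conn.execute("INSERT INTO fact_sales VALUES (1, 1, '2004-01-15', 10, 250.0)")

# Measures only make sense in context: here, revenue by region and category.
row = conn.execute("""
    SELECT c.region, p.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer c ON c.customer_key = f.customer_key
    JOIN dim_product  p ON p.product_key  = f.product_key
    GROUP BY c.region, p.category
""").fetchone()
print(row)  # ('Midwest', 'Hardware', 250.0)
```

Note how the bare measure 250.0 becomes meaningful only when joined to its dimensional context.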

Now that we have defined our Business Dimensional Model at a high level, let's discuss some technical design techniques to more easily manage your data. The appropriate technical design techniques could be influenced by the need for decision speed, knowledge management, partner integration, information self-serve strategy, the simplification and efficiency of the enterprise, or other factors.

These techniques include but are not limited to the following areas of consideration:

- Operational System Data Architecture Techniques

o Universal Key Management 
When an organization runs multiple disparate systems to gain efficiencies through the distribution of energy and resources, there is still a need to bring this information together for information analysis. Universal Key Management is a method to manage physical keys across these disparate systems in a type of Federated Architecture. This allows for high efficiency in data management and accuracy in logical data representation.
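As a rough illustration of the idea (the class, method, and system names below are hypothetical, not part of any product), a federated key registry might mint one universal surrogate key per logical entity and map each system's local key onto it:

```python
class UniversalKeyRegistry:
    """Hypothetical federated registry mapping (system, local key) pairs
    to a single universal surrogate key per logical entity."""

    def __init__(self):
        self._next_key = 1
        self._map = {}  # (system, local_key) -> universal key

    def register(self, system, local_key):
        """Return the universal key for a local key, minting one on first sight."""
        pair = (system, local_key)
        if pair not in self._map:
            self._map[pair] = self._next_key
            self._next_key += 1
        return self._map[pair]

    def alias(self, system_a, key_a, system_b, key_b):
        """Record that two local keys refer to the same logical entity."""
        self._map[(system_b, key_b)] = self.register(system_a, key_a)

registry = UniversalKeyRegistry()
crm_key = registry.register("crm", "CUST-0042")
# A matching process determines that a billing record is the same customer:
registry.alias("crm", "CUST-0042", "billing", "ACCT-9913")
# Cross-system joins can now line up on the universal key:
assert registry.register("billing", "ACCT-9913") == crm_key
```

The registry is the federation point: each operational system keeps its own physical keys, while analysis joins on the universal key.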

o Metadata Management
Metadata management is needed to understand the lifecycle of data and what it means to a business. I cannot tell you how many times I have seen organizations capture raw information without understanding its meaning or how it can be used to improve performance. An analogy for the value of metadata is the intelligence information captured by governments to monitor, track, and prevent terrorism. We have recently seen how this information has been inaccurately valued because metadata about the intelligence was missing or not provided. With the inclusion of metadata about how that intelligence information was received, the statistical validity of specific sources and methods, and other descriptive information, the government can be better informed to make better and timelier decisions.
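As a small, hypothetical illustration of this principle, business data can be captured alongside its descriptive metadata rather than as a bare value (all field names below are made up for the example):

```python
from dataclasses import dataclass

@dataclass
class ManagedValue:
    """A raw value carried together with metadata about its provenance."""
    value: object
    source_system: str   # where the data originated
    capture_method: str  # how it was collected
    captured_at: str     # when it entered the pipeline
    confidence: float    # statistical validity of the source and method

reading = ManagedValue(
    value=1842,
    source_system="pos-terminal-7",
    capture_method="scanner",
    captured_at="2004-01-27T09:30:00Z",
    confidence=0.99,
)

# A downstream decision can now weigh the value by its provenance
# instead of treating every number as equally trustworthy:
usable = reading.confidence >= 0.95
```

The same raw number with `confidence=0.40` would, and should, drive a different decision.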

o Standard Industry Supported Descriptors/Semantics
When defining the logical entities an organization chooses to track to support operational business as well as information analysis, it is important to try to use de jure, de facto, or industry-specific standards. By taking this approach, the organization is set up for easier data integration between internal systems and partner/supplier/exchange systems, and is better positioned to take advantage of off-the-shelf solutions (as vendors are driven to build toward standards). For example, within the healthcare industry there are a number of standards that organizations are either forced to leverage or should attempt to leverage, such as HIPAA, NCPDP, HL7, and others. One of the most difficult things to get through when dealing with data is semantics, and often the easiest way to get over this hurdle and remove any political/power struggles is to adopt an external standard. In addition to providing a common set of logical semantics, it should assist your organization in improving its position within a given industry. The pragmatic challenge in this approach is typically deciding, from an efficiency and change management standpoint, how far "inside" your core systems to carry the standard (a decision that will be based on the nature of your systems). Note: there may be understandable business reasons not to support a standard, such as a de facto standard created by your competitor, or an inherently inefficient standard created by non-operational entities (companies whose business does not rely on the actual effective implementation of the standard, i.e. they do not have to "eat their own dog food").

o Information Change Management
This is a personal favorite of mine. For any process to understand changes within the source system, the source system should capture this information. This is a huge issue with both off-the-shelf and custom-developed systems. I recently ran into a leading off-the-shelf order management package that had a number of information change management issues. For starters, there were a number of transaction-based tables that changed over time with no update timestamps captured on the records. The tables that did have update timestamps were not correctly maintaining parent-child referential integrity when writing the records. Additionally, the system would do hard deletes of specific information (a one-way trip!). A much better technique is to do soft deletes, such as using statuses to capture the current validity of certain information. Finally, some of the lookup tables had primary keys that were not being used for referential integrity; the package instead stored textual codes/descriptions in place of these primary keys. This is a huge issue when attempting to do information analysis.
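A minimal sketch of the soft-delete and timestamp techniques described above, again using Python's sqlite3 module and an illustrative order-line table:

```python
import sqlite3

# Every record carries created/updated timestamps, and deletion is a
# status flip ("soft delete") rather than a hard DELETE, so history
# survives for change tracking and downstream analysis.
conn = sqlite3.connect(":memory:")
conn.execute("""
CREATE TABLE order_line (
    line_id    INTEGER PRIMARY KEY,
    order_id   INTEGER NOT NULL,            -- parent key, kept referentially intact
    status     TEXT NOT NULL DEFAULT 'ACTIVE',
    created_at TEXT NOT NULL,
    updated_at TEXT NOT NULL
)""")
conn.execute("INSERT INTO order_line VALUES "
             "(1, 100, 'ACTIVE', '2004-01-27T09:00:00Z', '2004-01-27T09:00:00Z')")

# Soft delete: flip the status and bump the timestamp instead of DELETE.
conn.execute("""
UPDATE order_line
SET status = 'CANCELLED', updated_at = '2004-01-27T10:15:00Z'
WHERE line_id = 1
""")

# The record is gone from the "live" view, but it is not a one-way trip:
live = conn.execute(
    "SELECT COUNT(*) FROM order_line WHERE status = 'ACTIVE'").fetchone()[0]
history = conn.execute(
    "SELECT status, updated_at FROM order_line WHERE line_id = 1").fetchone()
print(live, history)  # 0 ('CANCELLED', '2004-01-27T10:15:00Z')
```

Any downstream process can now ask both "what is active?" and "what changed, and when?", questions a hard delete makes unanswerable.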

In next month's edition, Part III of this series will look at the Information Management and Delivery aspects of Enterprise Data Architecture.

- Charles Belisle
Enterprise Architect
Arctec Group


Enterprise Architecture News

Power of Community Networks
Lawrence Lessig's essay comparing fiber network ownership with cable and other related industries.

IBM Open Source Payment Patent
IBM has received a patent for a payment process that compensates open source programmers.

Worms and Mass Destruction
Economist article which alludes to the business reality of security breaches and some possible remediations.

Hacking on Capitol Hill
Since 2004 is an election year, this will probably not be the last story of hacking amongst political rivals.

Open Source on the Local Scene

Novell Joins Eclipse Project
Novell continues its advances in the emerging open source communities.


Reader Views

"I can relate to your latest 'view' (Enterprise Data Architecture: The afterthought, Part 1) having spent several years at a firm with an enterprise scale GIS, with a lackluster EDA, that was developed expressly for decision support needs. It is hard to fathom such an inadequate devotion towards such an essential need.

Some thoughts on your question 'Why is Enterprise Data Architecture the afterthought':

- Ignorance (e.g. lack of adequate knowledge around EDA, especially in the hands of decision-makers).
- Arrogance (e.g. the 'Ivory Tower' approach).
- Awareness of the importance of EDA but improper guidance for its implementation and inadequate visioning for its future.
- Evolution (often rapid) of data needs and/or improper monitoring of the existing EDA which results in exceeding the capability of the technology and people within organization.

Some other useful EDA sub-topics would be: 1) 'how to' suggestions on fixing an EDA that is already in place, and 2) impacts of a poor EDA on operational costs and revenue generation."

-Patrick Twiss


Have your say

Agree? Disagree? Insufficient data to judge? Email us at views@arctecgroup.net; we want to hear from you. We will publish your name or anonymize your response as requested.


Arctec Group News

Arctec Group CTO Gunnar Peterson will present a talk at the BlackHat Windows Security Conference on January 30th. The presentation is on "Security in the Software Development Lifecycle". The focus of the presentation is on specific design, process, and organizational elements and activities to understand and improve security within the enterprise. Particular attention is paid to mapping the strategic business vision to tactical and implementable solutions.


Arctec Group: Strategic Technology Blueprints
Arctec Group Newsletter is a free monthly newsletter. If you would like to subscribe to Arctec Views, simply send an email to views@arctecgroup.net from the email account you would like to receive the newsletter. Please include the word "subscribe" in the subject or first line of the email. If you would like to unsubscribe from the Arctec Views monthly newsletter, simply send an email to views@arctecgroup.net from the email account you would like to unsubscribe. Please include the word "unsubscribe" in the subject or first line of the email.

Copyright © 2004 Arctec Group, LLC All Rights Reserved