Wednesday, January 12, 2011

Using AHP for Architecture Assessment

Any development effort starts by defining the core areas that drive and support the structural aspects of the solution. Software development is no exception, and architecture is an essential part of it. Unlike most visibly structured implementations, software architecture is invisible and is evolving at any given time. Assessing the architecture and its implementation at regular intervals helps ensure that evolving needs are addressed through the right implementation choices, corrective actions, and governance. The Analytic Hierarchy Process (AHP) is a well-defined mathematical model for communicating decisions at large using comparative, informed decision-making inputs. In my experience, it can be used for much more than decision making. Assessment (which also results in a kind of decision making, but is more about identifying gaps) can be structured better using AHP. In this post I will try to explain the concepts and my experience with AHP for assessing application architecture, using non-functional requirements as the example.
Most assessments share the following common steps; I will explain each step in the context of a non-functional requirements assessment for architecture.
1. Identify stakeholders: some candidate examples of stakeholders include the following:
a. A product-shaping group or individual, such as a product manager, is mostly responsible for feature definition but may also provide some of the key non-functional requirements. It is important to identify all such stakeholders.
b. Enabling and supporting groups, such as the security or network groups.
c. Policy and governance groups, such as the legal and internal policies groups.
2. Understand stakeholder needs (concerns) for the architecture: sample needs are listed below:
a. The product manager requesting a 3-second response time, 99.99% availability, etc.
b. The security group requesting a vulnerability assessment that addresses the OWASP Top 10 concerns, authentication against Active Directory, encryption, etc.
c. The network engineer requesting data replication, notification, etc.
d. The policy group requesting usage of appropriate licenses and tools.
e. The legal group requesting appropriate legal notices and agreements.
3. Devise/refine baseline expectations to align with stakeholder and organization needs: the initial stage requires defining the expectations. At a later stage, the state of the architecture may change and the expectations might require refinement. Depending on the progress of the implementation, one has to select the current expected state of the architecture against which the assessment is to take place. Ensure these expectations are defined and categorized so they can be assessed independently. For example, security might be split into authentication, authorization, transport security, etc.
4. Prioritize assessment factors: each stakeholder concern is translated into an assessment area. For each area, consider evaluating multiple options against the focused area. For example, authentication might take place using different options such as Active Directory, a custom implementation, single sign-on, etc.
5. If required, devise a sampling mechanism for the assessment: in many cases it may not be possible to assess the entire architecture and code base. Adopt a mechanism to sample a portion of the candidate implementation for assessment, and increase the coverage as time permits or as the expectation of accuracy increases.
6. Assess the architecture against baselined stakeholder needs: at all times, it is important to make sure the assessment takes place against the baselined stakeholder needs.
7. Prepare the assessment report: this is the first (or a repeated) assessment report for early communication of the gaps and improvements.
8. Discuss the assessment report with stakeholders and revise: it is better to present the findings to the stakeholders early in order to seek inputs that can help refine the report further. The contents may include gaps, prioritization of fixes, risk identification, recommendations, course corrections, etc.
9. Present the final report: based on the different inputs, the final report is prepared and presented. It is tuned toward the stakeholder needs/concerns expressed in the early stage versus the current state of the architecture.
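Step 5 (sampling) can be sketched in a few lines of Python; the module names and coverage figure below are made up purely for illustration:

```python
import random

def sample_for_assessment(candidates, coverage=0.3, seed=42):
    """Pick a reproducible subset of candidate modules to assess.

    coverage can be raised on later passes as time permits or as
    the expectation of accuracy increases.
    """
    rng = random.Random(seed)
    k = max(1, round(len(candidates) * coverage))
    return sorted(rng.sample(candidates, k))

# Hypothetical candidate modules of an application
modules = ["auth", "billing", "search", "reporting",
           "notifications", "admin", "api-gateway", "audit-log"]
print(sample_for_assessment(modules, coverage=0.25))
```

Fixing the seed keeps the sample reproducible across assessment runs, so a later pass can widen coverage without invalidating the earlier findings.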
In general, the key areas for an assessment include:
1. The defined (expected) implementation
2. The existing state of the implementation
3. The potential gaps between the expected and existing implementation
Key outcomes of assessment include
1.       Gap analysis report
2.       Recommendations
As a mathematical model, the Analytic Hierarchy Process (AHP) fits best in the area of the gap analysis report. The recommendations, on the other hand, depend on subject matter experts and their opinions; these too could be structured further using AHP, but I will not delve into that in this post.
In general, AHP helps decision makers find a solution that best suits their goal and their understanding of the problem. It provides a better process for making informed decisions. AHP encourages decomposing a larger problem into related sub-problems, forming a hierarchical structure in which related sub-problems are grouped together. This allows the analysis to be more structured and independent. The key advantages of using AHP include a firm problem statement and the consideration of different options along with their associated weights. Overall, this results in a better understanding and evaluation of options or gaps. The AHP process is described in detail at http://en.wikipedia.org/wiki/Analytic_Hierarchy_Process.
One thing to note with AHP is the known concern about its rank reversal behavior; in the IT industry, however, this works well in most scenarios. For example, consider security vs. performance: it is known that heavier security may degrade performance, which is just one of many such cases.
In general, an AHP implementation requires three steps:
1. Identify the importance of each area and rank it against the other areas. This is referred to as ranking.
2. Identify the relative importance of the existing implementation compared with the desired implementation.
3. Report the gaps or findings that enable better decision making.
I will try and explain each of these in the following sections.

1.  Identifying the importance of each area and ranking it against other areas

The AHP rank concept is depicted below using sample areas (performance, availability, scalability, security, accessibility, and usability) as the different areas of concern.
The green cells are the ones you modify based on the requirements. The orange cells are always equal, as they compare an attribute against itself. The yellow cells are computed using the rank reversal explained above. Note that importance is defined horizontally (it can be vertical as well) but impacts the corresponding vertical cell. For example, performance vs. availability is High (highlighted with a red ellipse), which makes availability vs. performance Low. This means compromising on availability may also result in compromised performance. A detailed mathematical process for refining the ranks further for higher accuracy can be found at http://www.youtube.com/user/BPMSG#p/u/0/18GWVtVAAzs.
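Since the spreadsheet itself is shown as an image, here is a minimal Python sketch of the same matrix mechanics, using hypothetical pairwise judgments for the six areas. The geometric-mean method shown is one common way to approximate the AHP priority vector:

```python
import math

areas = ["performance", "availability", "scalability",
         "security", "accessibility", "usability"]

# Hypothetical pairwise judgments: only the upper triangle is
# supplied (the "green" cells); value > 1 means the row area is
# more important than the column area.
upper = {
    (0, 1): 3, (0, 2): 5, (0, 3): 1, (0, 4): 7, (0, 5): 5,
    (1, 2): 3, (1, 3): 1, (1, 4): 5, (1, 5): 5,
    (2, 3): 1/3, (2, 4): 3, (2, 5): 3,
    (3, 4): 7, (3, 5): 5,
    (4, 5): 1,
}

n = len(areas)
m = [[1.0] * n for _ in range(n)]   # diagonal ("orange") cells: always 1
for (i, j), v in upper.items():
    m[i][j] = v
    m[j][i] = 1.0 / v               # reciprocal ("yellow") cells

# Approximate the priority vector with the geometric-mean method.
gm = [math.prod(row) ** (1 / n) for row in m]
total = sum(gm)
weights = {a: g / total for a, g in zip(areas, gm)}

for area, w in sorted(weights.items(), key=lambda kv: -kv[1]):
    print(f"{area:13s} {w:.3f}")
```

The resulting weights sum to 1 and can be carried forward into the gap report; the exact judgment values would of course come from the stakeholders, not from this sketch.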

2.  Identifying the relative importance of the existing implementation versus the desired implementation

This is done by mapping the existing implementation to the expected implementation. Consider the following example for security, where the transport layer implementation does not match the expected implementation.


                 Expected Implementation       Actual Implementation
Authentication   Active Directory (5 points)   Active Directory (5 points)
Authorization    Custom (5 points)             Custom (5 points)
Transport        SSL (5 points)                Basic (3 points)


A relative point system can be used to weigh the implementations. It is relative in nature, with no fixed point scale, allowing each organization to devise its own point system. In the above scenario, SSL has 5 points and Basic has 3 points. There can be more options, such as None with 1 point, Hybrid with 4 points, etc., to communicate gaps.
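The point comparison above can be reduced to a simple fit ratio per area; the numbers below mirror the security table and are illustrative only:

```python
# Hypothetical point scores taken from the security example above
expected = {"authentication": 5, "authorization": 5, "transport": 5}
actual   = {"authentication": 5, "authorization": 5, "transport": 3}

def fit_ratio(expected, actual):
    """Fraction of expected points achieved; 1.0 means no gap."""
    return sum(actual.values()) / sum(expected.values())

print(f"security fit: {fit_ratio(expected, actual):.2f}")  # 13/15
```

A ratio per area gives a single number that can then be combined with the area weights from the ranking step.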

3.  Reporting gaps or decision-making findings

Reporting can be performed in different ways; one way to highlight the gaps is shown below:
The report indicates that the stakeholder concerns are well addressed in the areas of performance, security, and availability compared to scalability, accessibility, and usability.
Note that at this point only the gap is highlighted. A more detailed inspection is required to identify the areas causing the gap and to fix them. This would require a detailed questionnaire and subject matter experts to arrive at the corrective actions.
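As a sketch of how such a report could be derived, the area weights (e.g. from an AHP priority vector) can be combined with the per-area fit ratios; all numbers below are hypothetical:

```python
# Illustrative area weights (summing to 1) and assessed fit ratios
# (actual vs. expected points per area).
weights = {"performance": 0.28, "security": 0.24, "availability": 0.20,
           "scalability": 0.12, "accessibility": 0.09, "usability": 0.07}
fit = {"performance": 0.95, "security": 0.87, "availability": 0.90,
       "scalability": 0.55, "accessibility": 0.50, "usability": 0.60}

# Weighted gap: big gaps in high-priority areas surface first.
gaps = {a: weights[a] * (1 - fit[a]) for a in weights}

for area, g in sorted(gaps.items(), key=lambda kv: -kv[1]):
    print(f"{area:13s} weighted gap {g:.3f}")
```

Note that weighting can reorder the raw gaps: a small gap in a high-priority area may outrank a larger gap in a low-priority one, which is exactly the kind of insight the AHP ranking step contributes.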