Business Analysts Handbook

A history of business analysis from the very beginning


Pre-History to 1890's

The earliest aids for calculation started with the use of marked sticks, going back to at least 35,000 BC, and progressed with the invention of newer tools such as the abacus, the concept of zero and then of the fraction, followed by algebra, logarithms and calculus, as well as various mechanical, water-driven, textile loom and steam devices. Up until the late 1800's, such devices were mainly used for astrology, accounting and taxes, inventories, astronomy, navigation, reproducible pattern weaving and clocks.

As far back as Aristotle, society and business have been putting down on paper proposed theories and solutions for the division and departmentalisation of labour. These theories and solutions are important to the BA, as the BA is often responsible for mapping processes and for reengineering or modifying them.

In 1776, Adam Smith published what is often cited as the first documented example of a business process: the production of a pin. He showed that by identifying the steps in a process, you could apply the division of labour, assigning specialists to each step and so improving both the quality of the product and the speed of its production.

Then in 1890, the US Census was undertaken. If the census had been collated in the same manner as the smaller 1880 US Census, which had taken 7 years to complete, the 1890 US Census would have taken more than 10 years, finishing after the next census was due, because of the increase in the population. So Herman Hollerith developed a device of electro-mechanical relays that used punch cards as a means of memory, the punch card concept taken from the textile industry and its use in looms. The collation of the data was completed in 6 weeks, though the exercise ended up costing almost double that of the 1880 US census. It was also the first large scale census that attempted to gain some profile of those being counted beyond just age, location and sex, allowing more in-depth analysis of the data. The 1890 census was also the genesis of what was to become IBM.

The late 1800's saw the development of the concept of business management as a scientific discipline, with the works of Henry R. Towne and Frederick Winslow Taylor. The focus was on mapping out processes, planning the optimum work approach, training the workforce in that approach, and managing the workforce to adhere to the plan, replacing amateurish heuristic approaches to work and being active in training rather than leaving it to be learnt passively on the job. Taylor coined the phrase scientific management, often referred to as Taylorism or the Classical Perspective. Henry Ford was heavily influenced by Taylor's work, developing what has come to be known as Fordism, coupling vertical integration and assembly line manufacturing with higher salaries and decreased working hours. Lenin and Stalin were both impressed with Taylor's work and Henry Ford's application of it, and attempted to apply the approach in the early years of the Soviet Union. Taylor's work, combined with that of Frank and Lillian Gilbreth, led to the creation of what became known as the time and motion study.

1900 to 1930's

From 1900 to the 1930's, physics, pure and applied mathematics, chemistry and electronics all contributed to a vast number of significant advances in understanding and invention that led to radio, radar and television, as well as the birth of binary computers; by 1939, the first vacuum tube computer had been created at Iowa State University in the US. A common use of computing aids through this period was the calculation of trajectories for shells fired from artillery and for bombs dropped from planes.

Vilfredo Pareto, an economist, observed in 1906 that 20% of the population owned 80% of the property in Italy. He then compared this observation with other countries and found similar results. This was later generalised as the Pareto principle. It is often applied in business analysis, to focus on the 20% of functionality that will provide 80% of the returns, and in software engineering, where 80% of defects are often found in 20% of the code; the percentages are more figurative than literal. Essentially, most of the gains to be made with any work effort can be had by addressing only a small proportion of the problem space, beyond which you begin to run into the law of diminishing returns. The challenge is to identify where the best returns can be obtained with the least effort. A variation of this principle used in software engineering is the 90-90 rule, attributed to Tom Cargill of Bell Labs in 1985. The aphorism states: "The first 90% of the code accounts for the first 90% of the development time. The remaining 10% of the code accounts for the other 90% of the development time."
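As a minimal illustration of the kind of analysis this suggests, the sketch below, written in Python with entirely hypothetical defect counts per module, sorts modules by defect count and finds the smallest set that accounts for roughly 80% of all defects. In real data the split is rarely exactly 80/20; the ranking exercise is the point.

```python
# A minimal sketch of a Pareto-style analysis over hypothetical
# defects-per-module data: find the "vital few" modules that together
# account for ~80% of all recorded defects.

defects = {                       # hypothetical counts, for illustration only
    "billing": 120, "auth": 85, "reports": 40, "search": 25,
    "ui": 15, "export": 8, "admin": 5, "logging": 2,
}

total = sum(defects.values())
cumulative = 0
vital_few = []

# Walk the modules from most to least defect-ridden until ~80% is covered.
for module, count in sorted(defects.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += count
    vital_few.append(module)
    if cumulative / total >= 0.8:
        break

print(f"{len(vital_few)} of {len(defects)} modules ({', '.join(vital_few)}) "
      f"contain {cumulative / total:.0%} of the defects")
```

On this made-up data, 3 of the 8 modules account for just over 80% of the defects, which is where review and testing effort would pay off first.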

Henry Gantt, with the release of his Gantt charts in 1910, is acknowledged as being heavily influenced by Taylor's work. The first such chart had actually been created by Karol Adamiecki in 1896, but Adamiecki did not publish his version, the harmonogram (or harmonograf), until 1931, which is why today we refer to them as Gantt charts.

In 1926, Henry Ford introduced the concept of Just In Time, describing it as "dock to factory floor". As its name implies, it is a process where stock is ordered only as required, with little stock held, thus reducing storage costs but potentially increasing purchasing and transport costs. It also increases the risk to the manufacturer if anything goes wrong with the supplier or the transport link.

Henri Fayol in 1917 published Administration industrielle et générale, which was translated into English in 1949 under the title General and Industrial Management. Whereas Taylor viewed management processes from the bottom up, Fayol viewed them from the top down; Fayol had, however, been influenced by the French translation of Taylor's work that was applied in government factories during World War 1. The focus of Fayol's work, based on his experience managing coal mines, was on the management of people and teams, reporting lines and control, with many parallels to military structures. This aspect came to be known as the span of control, a term taken from the military. At the time it was usual for one manager to oversee an average of ten employees.

Gulick and Urwick in 1937 proposed several strategies for departmentalisation. Essentially, the division of labour creates specialists who need management, and the activities these specialists perform can be organised by function performed, product line, customer profile, geography or territory, or by product, service or customer flows. The choice and efficiency of how an organisation is structured depends on whether the organisation is stable or needs to react to change.

1940's

Through the 1940's, increasing resources, aided by the development of newer computing devices, were applied to cryptology, aerodynamics, trajectories, meteorology and physics, the latter leading to the splitting of the atom. This period saw the first programmable computers being developed. In the latter half of the 1940's, computers advanced with the invention of electronic memory storage, higher level computing languages and the ability to modify a program during its execution. By the end of the 1940's, almost all computers were using valves.

During World War II, the British government, in an attempt to address occupational accidents such as munitions going off in factories, introduced manufacturing quality standards that would eventually evolve into the management standard BS 5750. The standard did not dictate what was manufactured or how, but what records and reports needed to accompany the manufacturing process, as a way of maturing the management processes.

The late 1940's also saw the introduction of Cybernetics and Information Theory, two important fields for understanding how systems work, identifying issues in systems, and determining how systems should work. These fields, especially Information Theory, are applicable across a large range of disciplines.

1950's

1951 saw the first business computer, built by J. Lyons and Co., used for processing their payroll. By the mid 1950's, there were estimated to be in excess of 100 computers in the world, mostly research platforms in universities or machines supporting governmental needs such as census counting and cold war defence early warning systems. The late 1950's saw IBM release FORTRAN, the brainchild of John W. Backus and the first high level language, while LISP, another high level language favoured by Artificial Intelligence programmers, was developed by John McCarthy at the Massachusetts Institute of Technology in 1958. IBM released the first dot matrix printer and Texas Instruments invented the integrated circuit.

Also in 1951, Armand Feigenbaum released a book whose key concept came to be known as Total Quality Management (TQM), though the term itself was first used in a published book in 1961 and was in turn a phrase taken from the US Naval Air Systems Command. TQM involves the random sampling of products using statistical methods and testing the sampled products until failures are found, and is more commonly used in manufacturing. The use of TQM reached its peak in popularity between 1992 and 1996.

In 1954, Peter Drucker introduced the notion of the span of managerial responsibility. This was a modification of the span of control whereby, instead of micromanaging, managers were encouraged to delegate responsibilities to their employees.

Booz Allen Hamilton, under contract to the US Navy in 1958 as part of the work on the Polaris submarine project, invented the Program Evaluation and Review Technique (PERT) chart as a model to assist management, in something of a revival of scientific management.

The Algorithmic Language (ALGOL) was released in 1958 as a result of collaboration between US and European researchers. While it never made much impact commercially, it did become the de facto standard for the next 30 years for describing software algorithms, and served as the basis of many computing languages that were to follow.

1960's

The term software engineering started being used in the late 1950's and early 1960's, beginning the conceptualisation of software development as an engineering discipline; it is the youngest of the engineering disciplines. Applying the label of engineering to software development has led to much debate over how applicable it is to a discipline that is partly logic based and partly creative, with each software project a new invention, always containing a degree of the unknown, though often based on common principles. By treating it as an engineering discipline, there have been many attempts to apply traditional manufacturing processes to software projects, with limited success. Some see it as more a craft than a science. Others see the field as too immature, with major changes every decade: procedural programming, structured programming, component oriented programming, object oriented programming, and so on. David Parnas, a pioneer of modular software design, has been at the forefront of promoting professionalisation and the use of the term Software Engineer, stating that Software Engineering is a form of Engineering (1998 - Software Engineering Programmes are not Computer Science Programmes). Steve McConnell, an IEEE member of Code Complete fame and regarded as one of the most influential individuals in the software industry, has stated that Software Engineering is not a form of Engineering, but that it should be (2003 - Professional Software Development: Shorter Schedules, Better Projects, Superior Products, Enhanced Careers). And Donald Knuth, of The Art of Computer Programming and TeX typesetting fame, states that programming is an art (1974 - Computer Programming as an Art).

Additionally, most processes are sold as being best practice, where best practice is often a euphemism for latest fad. Unfortunately, there is no way of verifying what is best practice in the way that, say, medicine can. Due to the cost of development, the variability of problems, the range of potentially acceptable solutions, the range of capabilities of team members, and the corresponding uniqueness of each project, you cannot submit a process to double blind trials, or repeat a number of processes multiple times for the same solution, and say that one process is categorically better than another, or that one computing language or tool, or whatever is being sold, is necessarily any better than a competitor's. Even the frequent claims of being the best tool or process at reducing the Total Cost of Ownership (TCO) are very difficult to prove.

Computers based on transistors are regarded as second generation computers; the first transistorised computer appeared in 1953, though the era usually refers to computers created between 1959 and 1964. This period saw an explosion of computer languages being released, including COBOL in 1959. Simula was one of the earliest computing languages that could be considered object oriented. Large business organisations were beginning to adopt computers, primarily to run their payrolls. One of the first computer games was written in 1962 and the mouse was invented in 1963. Computers based on integrated circuits are regarded as third generation computers, with the era covering computers created between 1964 and 1972. Interactive computing with a graphical interface, mouse, full screen word processing and hypertext was demonstrated in 1968. In 1969, ARPANET, the forerunner of the current Internet, was created by the US Department of Defense as research into networking and, according to a popular though disputed account, as a way of hardening defensive communications through redundancy in the event of a nuclear attack.

Due to cost overruns, failed projects, property damage and deaths caused by software defects, and no ready solution to address these issues, the term Software Crisis was coined. It has been a long running crisis, and of late there has been a reluctant acceptance that developing software is difficult, and that while incremental improvements are being made in these areas, the complexity of requested solutions is also increasing, negating those improvements. It is therefore not so much a Software Crisis as the state of software development, though there are still voices that say there is no need for it to be so. As a general rule of thumb, the complexity of solutions increases in tandem with the power of the computers, and the expectations of the user base increase with its IT literacy. The limitations of a computer's power, the development tools and processes available, and the maturity of the process and capabilities of the developers tend to constrain the complexity of a solution (as opposed to the complexity of the implementation), while the increasing IT literacy and expectations of the user base tend to push it up.

The Software Crisis led to every new piece of marketing hype through to the 1990's, and to a lesser extent beyond, being sold as the silver bullet that would address the crisis. Again, a reluctant acceptance has recently developed that there is no single silver bullet that will address the issues faced in developing software. However, most of the processes and some of the tools of the last 50 years are still around, having been developed in attempts to address the Software Crisis. This helps provide some insight for the BA into why there is no common standard process for software development or process engineering. Unfortunately, many organisations and managers are still guilty of searching for that silver bullet, clinging to the vendor's sales pitch, or jumping from one industry fad to another while ignoring their own basic principles. It is a trap that everyone associated with the IT industry should be wary of falling into.

1970's

Fourth generation computers, based on Large Scale Integration (LSI) circuits, refer to computers created from 1972 to the present. In 1970, Intel released the first commercial DRAM chip, holding 128 bytes. 1971 saw the first portable electronic calculators, built around Texas Instruments chips, and the release of the first microprocessor by Intel, while Niklaus Wirth released the Pascal computing language, based on the ALGOL language of the 1950's, in 1970. In 1972, ARPANET went international with a link across the Atlantic. With the microprocessor and Ethernet (1973), the 1970's saw a range of small household personal computers released to the public and a proliferation of Local Area Networks (LAN) to connect them: the Altair, Apple I, various Z80-based machines, the Commodore PET, Apple II, Tandy TRS-80 and others were all released over a period of a few years. The 1970's also saw the release of the C programming language by Dennis Ritchie, in which Unix was rewritten, and the founding of Microsoft in 1975 to market its BASIC interpreter for the Altair, the BASIC language itself dating from 1964.

The distribution of ubiquitous, cheap, flexible computers, and the corresponding increase in the market and demand for accompanying software outside the constrained environment of large business, university and government mainframes, led to a demand for ever more managers and software developers. That demand was satisfied by less experienced and less capable professionals than had previously existed in the industry, which led to a worsening of the Software Crisis.

As a way of addressing the Software Crisis, in 1970 Winston Royce published an article covering both a waterfall process as the way not to do software development and an iterative process as the way to do it. Unfortunately, most readers of Royce's article took up only the waterfall process and ignored the flaws that Royce had identified.

In 1972, David Parnas released his seminal paper On The Criteria To Be Used in Decomposing Systems into Modules. In this paper he introduced the concept of information hiding as an approach to designing software systems, a concept later described using the terms cohesion and coupling.

By the mid 1970's, the concept of information economies began to emerge, with increasing value placed on the generation of useful information from raw data. This in turn led to the emergence of data warehouses and Management Information Systems (MIS), giving rise to Information Management (IM). Coupled with these systems, manual workflow processes were developed to manage the movement of documents through organisations; over the following decades these transitioned onto IT platforms with the inclusion of document management and imaging systems.

Chuck Morris of IBM introduced Joint Application Development (JAD) in 1978. It is a process for defining requirements with a multi-disciplinary team via managed workshops.

Philip B. Crosby in 1979 released a Quality Management Maturity Grid (QMMG) in his book Quality is Free, for businesses and organisations to assess the maturity of their quality management processes.

1980's

The 1980's saw the release of graphical computing platforms such as Apple's Lisa in 1983 and Microsoft Windows in 1985. 1989 saw the invention of the World Wide Web by Tim Berners-Lee at CERN, combining hypertext and networking.

By the 1980's, the concepts of an Information Revolution and an Information Age were being used to describe the use of computers and information. A key role for a business analyst is to understand what data can be used to extract information, and the value of that information. They also need to understand how to use that information to drive further process change, improving returns to the business or initiating new initiatives.

The management manuals of the Toyota Production System (TPS) were roughly translated into English in 1980, introducing what became known as Lean Manufacturing, though it wasn't until the 1990's that the term came into wide use with the publication of the best seller The Machine That Changed the World: The Story of Lean Production. As part of the rebuilding of Japan, Taiichi Ohno, an engineer at Toyota, visited the US and took inspiration from the writings of Henry Ford and Frederick Taylor, the principles put in place by W. Edwards Deming, who was involved in the Japanese reconstruction, and the just in time restocking practices of American supermarkets. The focus of Lean Manufacturing is on reducing waste, whether idle time or material. Both Scrum and eXtreme Programming have taken inspiration from Lean Manufacturing principles.

In 1980, the Central Computer and Telecommunications Agency (CCTA), a UK government office, released the Structured Systems Analysis and Design Method (SSADM). SSADM is a waterfall methodology, seen as the pinnacle of the document-focused, requirements-gathering, up-front design and estimation-led approach to software development.

Eliyahu Goldratt introduced the theory of constraints for systems management in his book The Goal, published in 1984. Rather than basing accounts simply on inputs and outputs, Goldratt suggests that you also need to make use of throughput accounting, identifying where your bottlenecks are and what they cost in throughput, so that you understand the cost and reward of addressing them.
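As a rough sketch of that reasoning, the Python example below uses hypothetical step capacities and upgrade costs: system throughput is capped by the slowest step, so only money spent at the bottleneck changes the throughput side of the cost versus reward comparison.

```python
# A minimal sketch of bottleneck reasoning from the theory of constraints,
# using hypothetical capacities (units/hour) and upgrade costs.

capacities = {"cutting": 50, "assembly": 30, "packing": 45}

bottleneck = min(capacities, key=capacities.get)
throughput = min(capacities.values())
print(f"Bottleneck: {bottleneck}, system throughput: {throughput} units/hour")

def throughput_gain_per_cost(step, extra_capacity, cost):
    """Extra system throughput bought per unit of cost by upgrading one step."""
    improved = dict(capacities, **{step: capacities[step] + extra_capacity})
    return (min(improved.values()) - throughput) / cost

# Upgrading the bottleneck raises throughput; upgrading a non-bottleneck does not.
print("Upgrade assembly:", throughput_gain_per_cost("assembly", 10, 5000))
print("Upgrade cutting: ", throughput_gain_per_cost("cutting", 10, 5000))
```

Spending the same amount at a non-bottleneck step buys no additional throughput, which is Goldratt's argument for identifying the constraint before deciding where to invest.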

Bill Smith of Motorola pioneered Six Sigma in 1986. It developed as a means of keeping factory defects within a certain tolerance and has become a primary element of many organisations' Total Quality Management (TQM) initiatives. Its applicability and use as a process for managing software development to control quality have been seriously questioned.

In 1987, the British government persuaded the International Organization for Standardization (ISO) to adopt the standard BS 5750, whose origins went back to British manufacturing standards of World War II, as the international standard ISO 9000. It was also influenced by existing US and other defence military standards. The emphasis was on conformance to procedures rather than on the process of management. ISO 9001, ISO 9002 and ISO 9003 (which have subsequently been combined into ISO 9001) apply to the quality assurance of software development processes.

Barry Boehm in 1988 published his Spiral Model as an iterative approach to software development. The approach is based on the iterative refinement of prototypes, with iterations being anything from 6 months to 2 years.

In 1988, ISO/IEC 12207 Software Lifecycle Processes was proposed; it was released in 1995. It covers the development, support and organisational lifecycles of a software product or service, and was developed as a joint exercise of the ISO and the International Electrotechnical Commission (IEC).

In 1989, Howard Dresner popularised the term Business Intelligence (BI), used to describe the capture, interpretation and use of information in decision making. Gaining the information needed to make a decision can often be more costly than the value of the decision itself, so work has also been done on reducing the cost of obtaining decision-relevant information.

In the same year, Watts Humphrey released his book Managing the Software Process, detailing the Capability Maturity Model (CMM), which had been developed from 1986 at the Software Engineering Institute (SEI), a US Department of Defense funded institute. Based on Crosby's QMMG, the CMM was originally intended as a tool to assess the software development maturity of potential defence contractors, but subsequently evolved into a model for software process improvement.

Also in 1989, the CCTA released the information systems project management process PRINCE as a UK government standard; it soon became regularly applied outside of IT environments.

Bill Moggridge first proposed Interaction Design in the late 1980's; initially he called the field SoftFace. It is the discipline of defining and creating how systems behave and interact, where a system could be anything from people, computers and clothing to organisations, looking at behaviour as well as form and assessing the usability and emotional factors of the system being studied.

With the flattening of organisation structures through the 1980's, the span of control changed significantly from an average of 1 manager to 10 employees, to 1 manager to 100 employees.

1990's

1991 saw the release of Linux and of the Pretty Good Privacy encryption program. 1993 saw the release of Windows NT 3.1, and the year heralded the explosive growth of the Internet with the release of the graphical web browser NCSA Mosaic, followed by Netscape Navigator the year after. Java was released by Sun in 1995, and in the same year Microsoft released Windows 95, which under the covers was MS-DOS 7.0. The Yahoo! and AltaVista search engines were also launched in 1995, followed by Google in 1998. The mid to late 1990's was the era of chasing after the next killer application. The chase for the next big thing led to over-inflated stock prices for many IT and web based businesses, often without any sound business case. The undercurrent of hysteria around addressing Y2K issues also led to inflated IT salaries. When the New Year of 2000 arrived with no significant issues caused by the date change, even in those countries that had largely ignored the issue, the result was somewhat anticlimactic. Then the dot.com bubble burst in March 2000.

In 1990, Michael Hammer argued that technology, and IT in particular, was being applied to automating existing work rather than to identifying and obsoleting non-value work. Out of this emerged what has become known as Business Process Reengineering, with which many draw parallels to Taylor's principles of scientific management. Its pure application led to large numbers of staff being retrenched, and to very large changes over very short periods for a large number of organisations trying to stay competitive, which happened to coincide with the latter part of the 1988 to 1992 recession in Western stock markets.

In 1991, James Martin, formerly of IBM, released a book detailing the approach he had developed called Rapid Application Development (RAD). A response to the monolithic approaches of software methodologies such as the waterfall process and the CMM and ISO standards, RAD was developed to address the issue of requirements changing before the software was complete, which often resulted in delivered products no longer fit for purpose. It took inspiration from the likes of Boehm, and was an attempt to create an agile approach to software development.

The German federal government in 1992 released the V-Model process for software development, with an emphasis on the traceability of requirements through the different phases of the development life cycle and on assigning ownership of the various steps. It is now in wide use across a large number of organisations.

ISO/IEC 15504, SPICE, was proposed in 1993. Based on a combination of ISO 12207 and CMM, it is a framework for the assessment of software processes. Whereas CMM gives a single maturity rating, effectively capped by your weakest area, a SPICE assessment is a matrix, so you can publish both your strengths and your weaknesses. SPICE initially stood for Software Process Improvement and Capability Evaluation, but it now stands for Software Process Improvement and Capability Determination.

In 1995, the Dynamic Systems Development Method (DSDM) was published. Based on Martin's RAD and created by a consortium of vendors and experts in the field of Information Systems (IS) development, its goal was to produce a framework that would deliver software meeting the customer's requirements on time and on budget. Also out of DSDM came MoSCoW, a technique used to prioritise customer requirements into must have, should have, could have, and won't have now but would like later.

Jeff Sutherland and Ken Schwaber presented a joint paper outlining their experience with Scrum in 1996. They had originally implemented scrum practices independently, based on work originally published by Takeuchi and Nonaka in 1986 and referred to in a 1991 publication by DeGrace and Stahl. Takeuchi and Nonaka noted that small, cross functional teams tend to be the best performers. Scrum is an agile methodology which is not restricted to software development in its applications.

PRINCE2 was released in 1996 as a generic project management methodology, evolving from the 1980's PRINCE process. It has since become the de facto standard for project management in the UK.

Donella Meadows in 1997 published her Twelve Leverage Points, in which she observed that all systems have leverage points where a small shift can cause large changes, and that while we usually know instinctively where these leverage points are, we usually push them in the wrong direction.

The three Amigos, Grady Booch, James Rumbaugh and Ivar Jacobson, working for Rational Software Corporation, released the Unified Modeling Language (UML) version 1.0 draft in 1997. Until this point there had been a plethora of modelling notations around, but UML quickly became the industry's de facto standard.

In 1998, Rational Software Corporation released the initial version of the Rational Unified Process (RUP), version 5.0; Rational was subsequently purchased by IBM in 2002. RUP is often seen as a monolithic process similar to the impositions of CMM and ISO compliance. However, RUP is meant to be customised for each individual project, with a configuration process choosing which RUP process modules are appropriate and applying them. In its most basic form, RUP can be applied as an agile methodology.

Kent Beck, Ward Cunningham, and Ron Jeffries developed eXtreme Programming (XP) while working on a payroll project with Chrysler in 1996. In 1999, Kent Beck released his book Extreme Programming Explained. XP is the most widely known of the Agile software development processes. While XP does not introduce anything new, drawing on practices going back decades, such as JAD, Test Driven Development (TDD), pair programming, continuous builds, keeping feedback loops as small as possible, customer involvement and refactoring, it is the particular combination of these practices and the accompanying ideology that makes it distinctive. One of the ironies of the more zealous proponents of XP is that while they acknowledge that XP is not a silver bullet, when examples of XP failures are pointed to, they often say it must have been because the XP process was not fully followed, overlooking the fact that the Chrysler Comprehensive Compensation system (C3), on which XP was pioneered, was cancelled in early 2000 having been essentially unsuccessful.

Feature Driven Development (FDD) was devised by Jeff De Luca on a project for a large Singapore bank in 1997. The approach was first introduced to the world in a single chapter of a 1999 book by Peter Coad. FDD is a model-driven, short-iteration agile development process.

2000's

Since the dot.com crisis, many larger organisations in Western nations have offshored and outsourced much of their IT. Some companies are taking it further by doing so in multiple countries, to take advantage of around-the-clock development and support and to ensure that any local crisis will not unduly affect them. This has also led to large numbers of people from lower cost countries being sponsored by companies to work for short durations in the country where the company is located.

As a business analyst, knowledge, empathy and tolerance of other cultures and nationalities are very important, but these are also challenging aspects of modern large corporations. An important skill is being able to communicate effectively using a range of tools: email, whiteboards, messaging, phone, conference calls, video calls, shared applications, and so on. It is also beneficial to be aware of the potential sensitivities and frictions that may exist in such environments. Another consideration in distributed teams is the increased management overhead needed to address the negative impact on communications.

The early 2000's saw the emergence of Business Process Management systems, growing out of workflow systems; the associated notation languages are still evolving. The emergence of these systems and tools led to the development of what has come to be known as Business Process Management (BPM), which is about continuous evolutionary change in line with Non-Linear Management, as distinct from the large one-off changes of Michael Hammer's Business Process Reengineering, an example of Linear Management.

The Software Engineering Institute (SEI) released the Capability Maturity Model Integration (CMMI) in 2001, as a successor to CMM in an effort to improve the usability of maturity models for software development and other disciplines. It has also incorporated many of the ideas of SPICE.

In 2001, 17 people, mostly XP practitioners, got together due to their shared interest in "lightweight methods" of software development. They released the Agile Manifesto and formed the Agile Alliance to represent the various lightweight methods under the one umbrella, Agile.

The Sarbanes-Oxley Act of 2002 (Public Company Accounting Reform and Investor Protection Act of 2002), also known as SOX, was introduced to address a range of corporate and accounting scandals in the US. It acknowledges the role that IT plays in maintaining the security, authenticity and accountability of a company's accounts, and it has had a big impact on the processes around developing software and managing production data. Any publicly listed company trading on the US stock exchanges needs to be compliant with this act. As a business analyst, it is important that you are at the very least familiar with the essentials of the act and its implications for a company's processes and IT departments. There is some debate on whether agile methodologies are compliant with SOX.

There has also been a move away from the specialist division of labour, with a corresponding impact on the traditional span of control. Teams are less hierarchical and more cross functional, and there is an increasing tendency to organise teams around vertical capabilities rather than horizontal ones. So rather than a team of BAs, a team of developers and a team of testers, a single team is made up of BAs, developers, testers and members from other departments, and that team is responsible for the whole life cycle of a functional piece of work. The members of these teams are managed using matrix management, first introduced in the early 1970's to manage a pool of resources who cycle across projects.
