tonipetov for Software AG Tech Community


Analysis of business-critical core applications

Adabas & Natural 2050

*Striking a balance between cutting costs and optimally supporting the business model is a major challenge for many IT departments. They also need to determine how to adapt the level of IT support to the needs of the business model without increasing the complexity of their IT infrastructure or letting application landscapes grow out of control. From conducting a comprehensive analysis to establishing key decision parameters, this article covers the journey to a successful core application transformation geared to the needs of the business model.*

Core applications in a changing IT landscape

Every large, evolving IT application landscape contains extensive business-critical applications that have been developed in-house over many years. Their fitness for the future must be put to the test. These customized core applications may implement core processes or manage huge streams of financial data at organizations in the public and private sectors. Any failure on their part would cause colossal financial damage to the company and/or irreparably harm its image.

It is up to IT decision makers to determine the future of these applications: Should they be replaced, re-implemented or modernized? Errors or insufficient consideration of certain aspects (e.g., complexity, expertise, transparency, governance) during this decision phase will lead to failed projects (“money pits”) or missed deadlines and budgets without adding value to the business or IT.

In order to mitigate risks, maximize potential benefits and ensure project success, an extensive core application analysis must be incorporated into the decision-making process. The analysis must consider technology and business functionality of the core application as well as the knowledge of all associated stakeholders, such as the business unit, users, programmers and IT operations.

The ultimate goal is a gradual, transparent and controllable application transformation that addresses the current and future requirements of a growing digital world without jeopardizing the intrinsic value or operation of existing applications.

The need for comprehensive analysis

How can a transformation be successful if the dependencies between systems are not clear, if the modules and elements aren’t transparent, and interfaces are poorly maintained and only minimally documented? How can a transformation take place if you can’t tell which modules and components are used in a business process and how? What knowledge of applications will be lost when staff members soon retire? How do you realistically calculate the value of factors such as cost, time and risk given these conditions?

A comprehensive analysis must address these issues to establish a solid information base for decisions that evolve applications into a future-proof architecture in a controlled way.

_Figure 1: Core Application Analysis and Transformation_

The analysis of an application must involve all relevant organizational entities, such as the business unit and Research & Development. The analysis must encompass the business and IT contexts of the core application, for example, industry-specific methods, business processes, software products, IT infrastructures and IT architectures.

Only a complete analysis of business and IT contexts will enable you to make a thorough evaluation of your core application to make decisions with calculated risks. In addition to application details, the dependencies of system components and systems in the development and runtime environments (e.g., programming languages and environments, database systems, middleware, job control) as well as IT infrastructure (e.g., operating systems, hardware, devices) must be transparent.

Static aspects (e.g., application structure, source code, interfaces) as well as dynamic criteria (e.g., runtime behavior) must be considered. All relevant business and IT factors must be documented consistently and in relation to each other. All stakeholders should have access to the results of the analysis to view and to use for collaborative decisions.

_Figure 2: Core Application_

The complexity of core applications

Business-critical applications, whether dialog or batch, are highly complex. This is not only reflected in the source code but also in the respective business and database transaction logic. There are also multiple interfaces involved that tightly link internal and external IT systems, servicing a variety of business processes and user groups.

Core applications run primarily on mainframe platforms or highly scalable server platforms and employ technologies such as Natural, COBOL, Adabas, DB2® or VSAM™. Because these systems have been optimized over many years, they achieve a high level of operational quality that is apparent through their fulfillment of rigorous Service-Level Agreements (SLAs).

Total transparency and complete knowledge of these linked applications are at risk due to factors such as employee retirement and the ensuing generational transition. Up-to-date, consistent documentation of the IT implementation and the related business functionality is often not available.

_Figure 3: Complexity of Core Applications_

An integrated analysis platform

To document all context-related elements, establish their relationships and share them with various stakeholders, you need an integrated, collaborative analysis platform. There, information on business functionality is defined in relation to core applications and to overarching elements of the IT portfolio and enterprise architecture.

This gives all stakeholders a comprehensive, consistent and central knowledge base for inquiries and decisions.

Core application analysis

Core application analysis must consider structural (static) and runtime (dynamic) aspects in order to establish the level of complexity and criticality with regard to use of the application.

_Figure 4: Integrated Analysis Platform_

Static application analysis

Business and process logic are implemented in source code and the associated program structures and libraries. However, source code often contains more “spaghetti code” than structure, which makes it difficult to recognize and change the business logic. Natural Engineer, a tool for static analysis of applications (e.g., program and data structures), accurately maps the current situation. Various assessment options make the complexity of static application analysis more manageable.

_Figure 5: Static Application Analysis_

An ideal analysis platform, like Natural Engineer, offers these capabilities:

  • Analysis tools for COBOL, IBM® CICS® tables, JCL and Natural source code
  • Support for Adabas & Natural, COBOL and other 3GL languages and features, such as different program types (e.g., subprograms, copybooks) and different product versions
  • Interface recognition and documentation
  • Automatic generation of application documentation, structure diagrams (e.g., control flow, decision tables), reports and impact analysis in multiple formats (e.g., Microsoft® Excel®, Microsoft® Word, PDF and HTML)
  • Recognition of databases (e.g., Adabas, DB2®), database structures and access types
  • Complexity metrics (e.g., McCabe, Halstead)
  • Identification of obsolete or redundant source code
  • Web-based tools for interactive navigation of program structures and dependencies
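To make the complexity metrics mentioned above concrete, here is a minimal sketch of how a McCabe-style cyclomatic complexity count works on COBOL/Natural-like source. This is not how Natural Engineer is implemented; the keyword list is an illustrative subset, and a real analyzer would use a full parser rather than line-based pattern matching.

```python
import re

# Decision keywords that open a branch in COBOL/Natural-style code
# (illustrative subset; a real analyzer would parse the full grammar).
DECISION_KEYWORDS = re.compile(
    r"^\s*(IF|EVALUATE|WHEN|PERFORM\s+UNTIL|REPEAT|DECIDE)\b", re.IGNORECASE
)

def cyclomatic_complexity(source: str) -> int:
    """McCabe complexity = number of decision points + 1."""
    decisions = sum(
        1 for line in source.splitlines() if DECISION_KEYWORDS.match(line)
    )
    return decisions + 1

# Hypothetical snippet for illustration only
sample = """\
IF BALANCE < 0
  PERFORM WARN-CUSTOMER
END-IF
EVALUATE ACCOUNT-TYPE
  WHEN 'SAVINGS' PERFORM CALC-INTEREST
END-EVALUATE
"""
print(cyclomatic_complexity(sample))  # 4: IF, EVALUATE, WHEN, plus 1
```

A module scoring high on such a metric is a candidate for the “spaghetti code” restructuring discussed above; Halstead metrics complement this by measuring vocabulary and volume rather than control flow.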

Dynamic analysis of processes

Dynamic analysis evaluates an application’s runtime transaction load, database access, user interaction and service calls during normal operation and peak load times. It provides information on the application’s number of users during a specific time window and the efficiency with which it handles peak loads.

Entire Operations assesses batch job operation and whole batch job networks. The processing statuses of all jobs are examined to detect and report any SLA breaches and better understand dependencies.

_Figure 6: Dynamic Analysis of Dialog and Batch Applications_

An ideal analysis platform, like Entire Operations, offers these capabilities for dynamic analysis:

  • Dialog and batch application support
  • Source code diagnostics when executed (profiling and code coverage)
  • Production status monitoring across distributed and heterogeneous platforms
  • Detection of critical situations through regular monitoring of system KPIs
  • Measurement and visualization of completed batch processes and ascertainment of process differences (target vs. actual)
  • Graphical dashboards for fast and easy evaluation

Analysis of business functionality

The primary purpose of analyzing the core application’s business functionality is to establish the relationship between business and technical elements in order to determine which parts of the application implement which business rules and/or processes. Only these correlations can provide a full picture of the application, allowing a fact-based exchange between employees from business units and the IT department.

The interface between the static analysis tool Natural Engineer and ARIS for business functionality analysis provides knowledge about applications, programs and relationships in an easy-to-understand fashion for non-technical staff members. It serves as a basis for documentation of business processes and recommendations for improvements and enhancements.

_Figure 7: Analysis of Core Application Business Functionality_

An ideal analysis with the Natural Engineer ARIS Interface and the ARIS platform offers these capabilities:

  • Pairing of business processes and rules with their implementation components
  • Translation of technical program logic into an additional documentation format, such as Business Process Model and Notation (BPMN™), that business units can easily understand
  • Matching of batch data with business-related events (business logic)
  • Analysis of business processes and their implementation for optimization
  • Promotion of collaboration between business and technology departments
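The core of such a business-to-IT correlation is a cross-reference between processes and the modules that implement them. The sketch below shows the idea with hypothetical process and module names; the ARIS interface holds this information as process models, not as a Python dictionary.

```python
# Hypothetical cross-reference between business processes and the
# modules that implement them (all names are illustrative).
process_to_modules = {
    "Open Account":  ["ACCOPEN", "CUSTVAL"],
    "Post Payment":  ["PAYPOST", "CUSTVAL", "LEDGUPD"],
    "Close Account": ["ACCCLOS", "LEDGUPD"],
}

def modules_for(process):
    """Which modules implement a given business process?"""
    return process_to_modules.get(process, [])

def processes_using(module):
    """Inverse lookup: which business processes touch a given module?"""
    return sorted(p for p, mods in process_to_modules.items() if module in mods)

print(processes_using("CUSTVAL"))  # ['Open Account', 'Post Payment']
```

The inverse lookup is what enables the fact-based exchange described above: a business analyst can ask which processes a planned module change will affect, without reading the source code.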

Analysis of the IT portfolio

The results of the analyses of the core applications and their business functionality will lay the foundation for the overall inventory and analysis of the IT portfolio. The IT portfolio analysis should map out and evaluate business functionality, applications, technologies, strategy, requirements, projects and their complex relationships to each other with the aim of optimizing the application landscape and aligning it with the business objectives.

Ideally, this analysis uses an IT portfolio management platform like Software AG’s Alfabet and includes plans for implementation and compliance governance. It is particularly important here to have clear roles and responsibilities when it comes to gathering information, assessing the portfolio and making portfolio-related decisions. IT portfolio-management stakeholders need reliable information about applications and their context. They must also have access to role-specific content, views, functions and workflows.

_Figure 8: Relationship between Business Functionality, IT Portfolio and Core Application_

During the analysis process, the IT portfolio management platform Alfabet enables all stakeholders to:

  • Link business functionality, applications, technologies, projects, strategy, requirements and costs
  • Define standards for documenting and evaluating projects such as cost/benefit analyses
  • Utilize the results of application and business functionality analyses to optimize and align the application landscape with the business strategy
  • Assess how the application portfolio is impacted when business functionality or technology changes and vice versa
  • Create and monitor compliance with a development plan for the application landscape
  • Integrate existing IT repositories and their assets
  • Configure reporting and analytics functionality
  • Define and configure role-specific accesses and workflows

Planning and simulating a strategic IT transformation

The results of an analysis of applications, business functionality and the entire IT portfolio are the starting point for making decisions that will facilitate a reliable evolution plan to transform your current core application architecture into a future-proof target architecture. An effective analysis allows the relevant departments to continuously monitor and manage the transformation process as needed. Parallel operation of current and target applications during the implementation period means modified or added elements of the architecture can be evaluated early on, significantly reducing both risks and implementation time.

_Figure 9: Evolutionary Transformation of Core Applications_

Your path to success

A successful transformation requires comprehensive, relevant analysis results. The analysis should be executed in three steps:

  1. Analyze core applications taking into account static and dynamic methods for online and batch processing
  2. Analyze business functionality taking into account the results of step one
  3. Analyze the IT portfolio taking into account the results of steps one and two

If necessary, sub-steps can be carried out simultaneously, which can help you arrive at insightful results quickly.
