Ever wondered how much data software systems generate day in and day out? The answer could be enormous!
Take weekly software patches as an example. More broadly, consider the software development life cycle: analyzing which tools to pick for development, running build tasks, resolving compilation errors, fixing code compatibility issues, addressing defects, remediating security vulnerabilities, deploying to production, and handling defects that surface in production due to platform dependencies. Every one of these stages and phases generates data!
So we are talking about a huge amount of structured data: relationships between assets, versions, known software defects, compatibility issues, the cost to build and operate, and so on.
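To make that concrete, here is a minimal sketch of how such structured data might be modeled. The entity names and fields (Asset, AssetVersion, Defect, and their attributes) are illustrative assumptions, not the schema of any particular tool:

```python
from dataclasses import dataclass, field

# Hypothetical entities: names and fields are illustrative
# assumptions, not a schema from any specific tool.

@dataclass
class Defect:
    defect_id: str
    severity: str          # e.g. "low", "high", "critical"
    description: str

@dataclass
class AssetVersion:
    version: str
    build_cost_usd: float  # cost to build and operate this version
    known_defects: list[Defect] = field(default_factory=list)
    compatible_with: list[str] = field(default_factory=list)  # asset IDs

@dataclass
class Asset:
    asset_id: str
    name: str
    versions: list[AssetVersion] = field(default_factory=list)

# Example: one asset with a version carrying a known defect.
payments = Asset(
    asset_id="svc-payments",
    name="Payments Service",
    versions=[
        AssetVersion(
            version="2.3.1",
            build_cost_usd=1200.0,
            known_defects=[Defect("D-101", "high", "Timeout on retry")],
            compatible_with=["svc-ledger"],
        )
    ],
)

print(payments.versions[0].known_defects[0].description)
```

Even a toy model like this shows how quickly the relationships (asset to version, version to defect, version to compatible asset) multiply across a real estate of systems.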
There is also a lot of unstructured data: system performance logs, records of patching exercises, system upgrade tasks, failover drills, and so on.
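As a rough illustration of working with that unstructured data, here is a sketch of pulling structure out of a performance log line. The log format here is a made-up assumption; real systems will differ:

```python
import re

# Hypothetical log line; the format is an assumption for illustration.
LOG_LINE = "2024-05-01T10:15:30Z host=db-01 event=failover latency_ms=842"

# Split off the timestamp, then parse the key=value pairs into a dict.
pattern = re.compile(r"(\w+)=(\S+)")
timestamp, rest = LOG_LINE.split(" ", 1)
record = {"timestamp": timestamp, **dict(pattern.findall(rest))}

print(record)
# {'timestamp': '2024-05-01T10:15:30Z', 'host': 'db-01',
#  'event': 'failover', 'latency_ms': '842'}
```

Multiply one line like this by every host, every patch run, and every failover, and the scale of the problem becomes clear.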
This is the big data we all need to understand and be aware of, so that during upgrades we know the dependencies and the impact on already-running software and age-old systems.
Please comment and share your experiences with legacy software applications and the build and operate errors you have run into. I would love to hear them, so we can learn together in this community.