The World Wide Web (WWW), or simply the web, runs on top of the internet and is the most revolutionary information technology and application development platform in the world. Much of the present world economy depends on it.
The simplest way to conceptualize it is as a network of interconnected nodes, servers and clients, exchanging data over well-defined protocols such as TCP/IP, HTTP and FTP.
When building for the web to today's industry standards, it is important to have an up-to-date, relevant stack of tools and an architecture that supports CI/CD and QA at all levels.
Structure of web applications
A web application or website is a package loaded on the client side, containing multiple layers of interdependent modules built on the core web technologies:
- HTML: HyperText Markup Language (HTML) is the tag-based markup system that defines the structure and individual components of a document, which the browser compiles into the Document Object Model (DOM).
- CSS: Cascading Style Sheets (CSS) is the native style description language used to identify and style different parts of the DOM within the visible area of the page. It provides features like selecting elements by id, class, or their relation to other DOM elements.
- JavaScript: JS is a high-level, interpreted scripting language in which all the dynamic behavior of the application is written and executed.
These core technologies are augmented with layers of other technologies to increase their functional capability. From popular JavaScript frameworks like Angular, React and Vue, to CSS pre/post-processors like Less and Sass, to HTML templating engines, the domain of web-related technologies is truly vast.
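Before layering frameworks on top, it helps to see how the three core layers interact through the DOM at runtime. The sketch below assumes a page with a hypothetical `#theme-toggle` button and a `dark-mode` CSS class; neither comes from any real project.

```js
// Assumes hypothetical markup <button id="theme-toggle">Switch to dark theme</button>
// and a .dark-mode rule defined in the page's stylesheet.
const toggle = document.querySelector('#theme-toggle'); // HTML element exposed through the DOM

toggle.addEventListener('click', () => {
  // Toggling a class lets the CSS rules take over the visual change.
  const dark = document.body.classList.toggle('dark-mode');
  // JS can also rewrite the document's content on the fly.
  toggle.textContent = dark ? 'Switch to light theme' : 'Switch to dark theme';
});
```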
Besides the front-end layer, most applications also have a back-end or server-side layer: APIs, often built on microservices, and databases, which hold the data and business logic and abstract the information into well-defined contracts that the front-end can access via HTTP methods with the proper request format and credentials.
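As a concrete illustration of that contract, here is a minimal sketch of the front-end calling a back-end API over HTTP. The endpoint, response shape and bearer token are placeholders, not part of any real service.

```js
// Browser-side sketch: request a resource from a hypothetical back-end API.
async function fetchOrders(token) {
  const response = await fetch('https://api.example.com/v1/orders', {
    method: 'GET',
    headers: {
      Accept: 'application/json',
      Authorization: `Bearer ${token}`, // credentials sent along with the request
    },
  });
  if (!response.ok) {
    throw new Error(`Request failed with status ${response.status}`);
  }
  return response.json(); // the contract: JSON data the front-end can render
}
```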
[Figure: Web application architecture diagram]
Depending on the tools used and the nature of the website, it will need an appropriate hosting strategy and infrastructure. Websites can be hosted on a variety of systems, which can be broadly classified into:
- Static web hosts: Used for static websites; file-storage-based delivery platforms that provide domains, email, DNS and other features like SSL encryption and third-party integrations, e.g. GoDaddy, Hostinger.
- Dynamic web hosts: For dynamic web apps, cloud platforms like AWS, Google Cloud, Azure, Salesforce and IBM Cloud are preferred. They provide a range of computing features like virtual machines, databases and on-demand resource scaling as services over the internet. These platforms are the standard for deploying business web applications because of their advanced performance and security features, and they also provide a range of cutting-edge AI/ML-based tools.
A web application, even one with a fairly simple initial setup, tends to increase in complexity as more pages, content and functionality are added. Past a certain threshold of complexity, managing the application's behavior and auditing resource consumption and allocation become much harder.
Depending on the application, developers might prefer certain approaches to the application structure like:
- Single-page app (SPA): A web app in which all functionality is contained within a single document model. A large amount of functional logic has to be packaged and sent to the client's computer, with appropriate security and performance optimizations. Popular examples include Gmail, Facebook and GitHub.
- Multi-page app (MPA): The most commonly used structure, where the application is divided into multiple pages accessible through various route URLs. Server-side app frameworks and template engines are used to build such applications, and they have an inherent security advantage over SPAs.
- Progressive web app (PWA): A modern approach that uses service workers, a web app manifest and an app shell to let a web application be installed and run like a native app on handheld devices (a minimal sketch follows this list).
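To make the PWA plumbing concrete, here is a minimal sketch of registering a service worker and caching an app shell. The file names, cache name and asset list are placeholders, not taken from any real project.

```js
// In the page: register the service worker (sw.js is a placeholder path).
if ('serviceWorker' in navigator) {
  window.addEventListener('load', () => {
    navigator.serviceWorker
      .register('/sw.js')
      .then((reg) => console.log('Service worker registered with scope:', reg.scope))
      .catch((err) => console.error('Service worker registration failed:', err));
  });
}

// In sw.js: cache the app shell at install time so it can be served offline.
self.addEventListener('install', (event) => {
  event.waitUntil(
    caches.open('app-shell-v1').then((cache) =>
      cache.addAll(['/', '/index.html', '/styles.css', '/app.js'])
    )
  );
});
```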
No matter which approach you take, implementing test plans at the development stage and beyond is equally important for all of them, in order to ensure an efficient DevOps workflow.
Testing web applications
The components of a good website testing plan include a strategy, test objectives, test approach, test schedule, and test environment. The test strategy should be designed to ensure that the website meets the business requirements and is fit for purpose.
The test approach should include:
- Unit Testing: Testing individual parts of the codebase through unit tests, written in the application's language such as JavaScript or Python (a minimal example follows this list). With the advent of no-code/low-code and AI-based tools, parts of the unit-test writing process can now be automated as well.
- Integration Testing: Testing various parts of the website code as interdependent functions or modules, through test code or other tools triggered as the code is merged into the parent repository. All major code-hosting platforms like GitHub, GitLab and Bitbucket have native support for CI/CD integration.
- System Testing: Testing the website at the level of the user interface and features like login, signup and other supported flows, which validates parts of the website working together. Selenium is the most widely adopted browser-automation framework for this (see the login-flow sketch after this list).
- Acceptance Testing: Usually the final stage of testing, in which the fully assembled application, with data, is tested in a live or pre-production environment with actual or mock users. Automated visual testing is the most efficient way to manage the change-approval process on a rapidly changing product UI.
- Performance Testing: With an increase in users, it becomes vital to ensure that the servers can handle the request loads at peak usage times. Alongside performance, end-to-end security should be maintained at each point of contact between the website and its users, through proper use of HTTP headers and metadata analysis.
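As referenced in the unit-testing item above, here is a minimal sketch of a JavaScript unit test. It assumes Jest is installed; the `applyDiscount` module is a made-up example, not code from any real project.

```js
// math.js -- a tiny module under test (hypothetical)
function applyDiscount(price, percent) {
  if (percent < 0 || percent > 100) throw new Error('Invalid discount');
  return +(price * (1 - percent / 100)).toFixed(2);
}
module.exports = { applyDiscount };

// math.test.js -- run with `npx jest`
const { applyDiscount } = require('./math');

test('applies a 20% discount', () => {
  expect(applyDiscount(100, 20)).toBe(80);
});

test('rejects discounts over 100%', () => {
  expect(() => applyDiscount(100, 120)).toThrow('Invalid discount');
});
```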
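And for the system-testing item, a sketch of a login flow driven with the selenium-webdriver Node bindings. The URL, field names and the `.dashboard` selector are placeholders for a hypothetical app.

```js
const { Builder, By, until } = require('selenium-webdriver');

(async function loginFlowTest() {
  // Launches a real Chrome instance controlled over the WebDriver protocol.
  const driver = await new Builder().forBrowser('chrome').build();
  try {
    await driver.get('https://app.example.com/login');
    await driver.findElement(By.name('email')).sendKeys('qa-user@example.com');
    await driver.findElement(By.name('password')).sendKeys('not-a-real-password');
    await driver.findElement(By.css('button[type="submit"]')).click();

    // Wait for an element that only appears after a successful login.
    await driver.wait(until.elementLocated(By.css('.dashboard')), 10000);
    console.log('Login flow passed');
  } finally {
    await driver.quit();
  }
})();
```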
Role of web browsers
All websites need another application, known as the ‘browser’, running on top of the operating system of a device. Web browsers are built by many companies and are usually free.
[Figure: Leading web browsers as of 2021]
Web browsers play an important role, not only in making websites accessible to users, but also in helping the developers with a wide set of tools to test and debug various aspects of a web application under development. Most browsers provide development tools as an auxiliary interface to developers who want to peek under the hood and access the inner workings of a rendered web app. These tools can generally be accessed by right-clicking on a page and selecting the option ‘inspect’.
Let’s have a look at some of the primary tools provided in the dev tools section of Chrome:
- Elements: The element explorer provides access to the compiled DOM with a host of features to add/remove components, set states like hover, focus etc.
- Console: A log of the console output from the JavaScript execution, very useful for debugging. This panel can also be used to run JS code snippets on the active webpage and see their output (see the snippet after this list).
- Sources: Under the Sources panel you can see a list of all the source code files downloaded by the web page, grouped by the domain they were served from. On the right side of this tab, you have a script debugger which can be used to set breakpoints and debug the website's execution in real time.
- Network: This panel logs all the network calls to and from the webpage, with a long list of details like type, status, request/response, timing etc. It also has an option to simulate network availability scenarios by using the throttling feature.
- Performance: Under this panel, you can record runtime performance by capturing page-load and runtime events and exploring the detailed breakdown provided once the recording is processed.
- Memory: You can take memory heap snapshots to analyze things like memory leaks and object sizes, and visualize memory usage over time.
- Application: This can be used to evaluate, edit and debug service workers, worker caches etc.
- Security: Gives you a summary of the SSL certificate validity.
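As referenced in the Console item above, small snippets pasted into the Console can audit a live page in place. Here is one such sketch, which simply lists images that are missing alternative text.

```js
// Paste into the Console panel of any open page:
// collects the URLs of images without an alt attribute.
const missingAlt = [...document.querySelectorAll('img:not([alt])')].map((img) => img.src);
console.table(missingAlt);
```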
There is also a built-in device toolbar which lets you simulate user interface scenarios across a range of devices with a list of preset resolution profiles, network throttle, zoom level, screen rotation and the ability to enter custom resolution for testing responsiveness.
The Chrome dev tools section is constantly gaining newer and more advanced tools like Lighthouse, Recorder etc., which provide deeper ways to analyze the overall health of your application.
Furthermore, beyond the dev tools, browsers let you access a whole library of APIs for things like I/O, camera, location, audio/speech, network, cookies etc. The DOM and many other components of an active session are available for access at the object level.
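As a small illustration of those APIs, the sketch below reads the device location and the page's cookies from the browser console. Geolocation requires user permission and a secure (HTTPS) context; the cookie name and value are placeholders.

```js
// Location: asynchronous, permission-gated Web API.
navigator.geolocation.getCurrentPosition(
  (pos) => console.log('Latitude:', pos.coords.latitude, 'Longitude:', pos.coords.longitude),
  (err) => console.warn('Location unavailable:', err.message)
);

// Cookies: readable and writable through the document object.
document.cookie = 'theme=dark; path=/; max-age=86400';
console.log('Cookies visible to this page:', document.cookie);
```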
Testing local deployments
Before you move on to setting up a live testing flow in a browser, it is also recommended to implement a unit testing workflow in your application to ensure a strong and clean codebase.
When developers make a website, they check all changes and updates in a browser on their work computer. After preliminary functional testing, the next important part is the device and browser compatibility testing, which actually gives us a sense of how the app looks and performs on different device-browser combinations.
Because a developer has limited access to the full range of possible devices, some bugs become trickier to capture. Although the web has well-defined standards and specifications, discrepancies arise from differences in how each browser engine and JS interpreter implements them. Overall, a modern web browser by itself provides a loaded toolkit to open up and debug any aspect of the application.
Automated Visual testing
From the perspective of a user, the application is a visual experience, and the journey a user takes on a website is defined by the design specifications set by the business analysis and design processes. To ensure that the developed website stays true to the design goals, one has to ensure a proper regimen of visual testing and approval of changes to the code base.
This is where the QA team plays a leading role. The process of visual testing used to be mostly manual about a decade ago, but it is now increasingly automated. One efficient way to implement visual testing is to use a tool like Percy in your stack, which offers a user-friendly, tightly integrated way to perform visual tests in the CI/CD pipeline.
You can add your team members and configure it for the production environment, where each build can be evaluated for visual differences and the team can collaborate on approvals.
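For a sense of what that integration looks like, here is a minimal sketch assuming the @percy/puppeteer snapshot helper and the Percy CLI (run via something like `npx percy exec -- node snapshots.js` with a PERCY_TOKEN set). The URLs and snapshot names are placeholders.

```js
// snapshots.js -- capture visual snapshots of key pages for Percy to diff.
const puppeteer = require('puppeteer');
const percySnapshot = require('@percy/puppeteer');

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();

  await page.goto('https://app.example.com');        // placeholder URL
  await percySnapshot(page, 'Home page');             // uploads a snapshot for visual diffing

  await page.goto('https://app.example.com/pricing');
  await percySnapshot(page, 'Pricing page');

  await browser.close();
})();
```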
We have laid out a clear path: testing on your local machine manually, then with automation, and finally implementing visual testing to track and control UI changes. Once you have successfully implemented all these layers, your website is good to go to the production or live environment.
But that does not mean that the role of testing is over. It just means that if your test cases are properly configured and your CI/CD flow runs smoothly, you will spend less time on validations and error checks and more time actually developing new features. To ensure a healthy testing system, keep these points in mind:
- Test runs at all stages and levels need to be monitored: automation will help you execute tests and detect errors, but understanding the severity of a defect and finding a way to resolve it requires proper communication and allocation within the team.
- Dashboards provide a great way to monitor the status of builds and test runs, but having too many separate dashboards makes tracking difficult and wastes time, so integrate layers wherever possible.
- Any change in business or design specifications has to flow from the leadership through the development process and finally into the test scripts. In a healthy dev process, the flow of information from top to bottom is seamless and quick.
There are many different approaches to testing websites and many different combinations of tools that can be applied, but with new cloud-based technologies, web testing has become more feature-rich and reliable. The ability to collaborate with the team on the same platform when building test plans supercharges the output, as all members can access shared resources on the platform.