ClearBlade QA Metrics (Draft)
Overview
Make our QA measurable by defining metrics that track our QA progress over time. If we can measure QA, we can set objective goals for a new QA hire.
Definitions
Known Issue: A high-priority unfixed bug that has the potential to compromise the confidentiality, integrity, or availability of the platform and/or edge.
Metrics
Known Issues
Label all significant bugs in MONSOON with a Known Issue tag and track the number of Known Issues (label all backend bugs that appear to be high priority).
JIRA search for known issues: https://clearblade.atlassian.net/issues/?filter=10844
Crash Reports
Evaluate the ROI of using something similar to Crashlytics for Golang to send crash reports back home, reducing the number of crashes over time
- Automatically sends crash reports to ClearBlade on any platform failure, for all instances
Value:
- Allows us to be proactive and ensure we take care of all bugs
- Helps solve the log-rotation problem, since important information is recorded ahead of time
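If we go this route, a home-grown reporter could be as small as the sketch below. This is only an illustration: the endpoint URL, report fields, and package name are assumptions, not an existing ClearBlade API.

```go
package crashreport

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
	"runtime/debug"
	"time"
)

// reportURL is a placeholder collection endpoint; the real one is TBD.
const reportURL = "https://crash-reports.example.clearblade.com/v1/report"

// Report is an illustrative payload; the fields are assumptions, not a defined schema.
type Report struct {
	Instance  string    `json:"instance"`
	Version   string    `json:"version"`
	Timestamp time.Time `json:"timestamp"`
	PanicMsg  string    `json:"panic"`
	Stack     string    `json:"stack"`
}

// CapturePanic is meant to be deferred at the top of long-running goroutines.
// It sends a crash report home, then re-panics so the process still fails loudly.
func CapturePanic(instance, version string) {
	if r := recover(); r != nil {
		report := Report{
			Instance:  instance,
			Version:   version,
			Timestamp: time.Now().UTC(),
			PanicMsg:  fmt.Sprint(r),
			Stack:     string(debug.Stack()),
		}
		body, _ := json.Marshal(report)
		// Best effort: a failed upload must never mask the original panic.
		client := &http.Client{Timeout: 5 * time.Second}
		if resp, err := client.Post(reportURL, "application/json", bytes.NewReader(body)); err == nil {
			resp.Body.Close()
		}
		panic(r)
	}
}
```

Usage would be deferring CapturePanic(instanceID, version) at the top of each long-running goroutine; the upload is best effort so a network failure never hides the original panic.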
Code Coverage
Code coverage calculated and logged on every CI build
Set a code coverage goal of 80% (note the tool used; see Testing Tools below)
Gap Analysis
Perform Gap Analysis for Features: https://docs.google.com/spreadsheets/d/15iNjOBTjbL6YrH3OfBpqD7k8SFYNByvV41TJYbnhZDc/edit#gid=0
Unit Testing
System Testing
Value:
- Keeps us aware of untested features
- Allows us to prioritize their testing
- Helps us ensure that all our features are tested
Load Testing
High, steady load until failure
Ex: Publish 1,000 messages per second until the platform fails, e.g. from a memory leak over the course of 48 hours of running
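A minimal sketch of that steady-load publisher, assuming the Eclipse Paho Go MQTT client (github.com/eclipse/paho.mqtt.golang); the broker address, topic, and client ID are placeholders for illustration:

```go
package main

import (
	"fmt"
	"log"
	"time"

	mqtt "github.com/eclipse/paho.mqtt.golang"
)

func main() {
	// Placeholder broker address and client ID for illustration.
	opts := mqtt.NewClientOptions().
		AddBroker("tcp://platform.example.com:1883").
		SetClientID("qa-load-publisher")
	client := mqtt.NewClient(opts)
	if token := client.Connect(); token.Wait() && token.Error() != nil {
		log.Fatalf("connect failed: %v", token.Error())
	}

	// Steady load: ~1,000 messages per second until the first publish failure.
	const messagesPerSecond = 1000
	ticker := time.NewTicker(time.Second / messagesPerSecond)
	defer ticker.Stop()

	sent := 0
	for range ticker.C {
		payload := fmt.Sprintf("load-test message %d", sent)
		token := client.Publish("qa/load", 0, false, payload)
		token.Wait()
		if token.Error() != nil {
			// The first failure marks the break point; record how far we got.
			log.Fatalf("publish failed after %d messages: %v", sent, token.Error())
		}
		sent++
	}
}
```

Running something like this under resource monitoring for 48 hours is what would surface slow failures such as a memory leak.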
Stress Testing
Increase stress until failure
- Endurance test outputs for the first 5 scenarios (already available)
Value:
- Allows us to give numbers on how much stress the platform/edge can sustain
- Allows us to give Node Specs to customers
- Helps us measure feature optimizations better
- Helps us determine accurate numbers for parallelism
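A companion sketch for the stress ramp. The transport is abstracted behind a publish function (a real run would plug in the MQTT publish from the load-test sketch above); the starting worker count, step size, and ramp interval are assumptions to be tuned per scenario.

```go
package main

import (
	"log"
	"sync"
	"time"
)

// rampUp starts with startWorkers publishers and adds step more every interval.
// Each worker calls publish in a tight loop until the first failure, which
// marks the breaking point and stops the test.
func rampUp(publish func() error, startWorkers, step int, interval time.Duration) {
	var wg sync.WaitGroup
	stop := make(chan struct{})
	var once sync.Once

	worker := func() {
		defer wg.Done()
		for {
			select {
			case <-stop:
				return
			default:
				if err := publish(); err != nil {
					once.Do(func() {
						log.Printf("publish failed: %v", err)
						close(stop)
					})
					return
				}
			}
		}
	}

	workers := 0
	spawn := func(n int) {
		for i := 0; i < n; i++ {
			wg.Add(1)
			go worker()
			workers++
		}
	}

	spawn(startWorkers)
	ticker := time.NewTicker(interval)
	defer ticker.Stop()
	for {
		select {
		case <-stop:
			log.Printf("breaking point reached at %d workers", workers)
			wg.Wait()
			return
		case <-ticker.C:
			spawn(step)
			log.Printf("ramped up to %d workers", workers)
		}
	}
}

func main() {
	// Placeholder publish used only for illustration.
	publish := func() error {
		time.Sleep(time.Millisecond)
		return nil
	}
	rampUp(publish, 10, 10, 30*time.Second)
}
```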
Security Testing
- Performed by QA. Security testing checks how secure the software is from internal and external threats: how well it is protected from malicious programs and viruses, and how secure and strong the authorization and authentication processes are.
Network Testing
Authorization Testing
Verifying that authorization is enforced correctly. Ex: This also includes tests to catch bugs like the developer being logged in instead of the user when using the ClearBlade library.
Authentication Testing
Availability Testing
Resiliency to DoS and DDoS. Ex: rate-limit testing (see the sketch below)
Value:
- Allows us to consider rate-limiting our endpoints
- Makes us more available
- Continuous testing ensures we always adhere to the security rules when deploying new features
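A hedged sketch of what an automated rate-limit test could look like: hammer one endpoint and assert that the platform eventually answers 429 Too Many Requests. The target URL and request count are placeholders, and this assumes rate limiting is (or will be) enabled on the target.

```go
package security_test

import (
	"net/http"
	"testing"
	"time"
)

// TestRateLimit hammers a single endpoint and expects the platform to start
// answering 429 Too Many Requests once its rate limit is exceeded.
// The target URL and request count are placeholders for illustration.
func TestRateLimit(t *testing.T) {
	const (
		target   = "http://localhost:9000/api/example"
		requests = 500
	)
	client := &http.Client{Timeout: 5 * time.Second}
	rejected := 0
	for i := 0; i < requests; i++ {
		resp, err := client.Get(target)
		if err != nil {
			t.Fatalf("request %d failed: %v", i, err)
		}
		resp.Body.Close()
		if resp.StatusCode == http.StatusTooManyRequests {
			rejected++
		}
	}
	if rejected == 0 {
		t.Errorf("sent %d rapid requests and none were rate limited", requests)
	}
}
```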
DevOps Testing: Integration Testing
Involves smoke testing all our components working together, e.g. HAProxy, CB containers, Console, and exporters. Ex: smoke test that we get logs on the exporters when we run a blade-runner test on the platform. The current tests will be smoke tests only.
Value:
- Allows us to test any infrastructural changes we make
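A sketch of an automated smoke check along these lines: hit a health or metrics endpoint on each component and fail fast if any is unreachable. All hostnames, ports, and paths below are placeholders, not real deployment values.

```go
package smoke_test

import (
	"net/http"
	"testing"
	"time"
)

// TestComponentsUp checks that each piece of the stack answers HTTP before
// deeper integration tests run. All hosts, ports, and paths are placeholders.
func TestComponentsUp(t *testing.T) {
	endpoints := map[string]string{
		"haproxy":  "http://haproxy.internal:8404/stats",
		"platform": "http://platform.internal:9000/",
		"console":  "http://console.internal:3000/",
		"exporter": "http://exporter.internal:9100/metrics",
	}
	client := &http.Client{Timeout: 10 * time.Second}
	for name, url := range endpoints {
		resp, err := client.Get(url)
		if err != nil {
			t.Errorf("%s unreachable at %s: %v", name, url, err)
			continue
		}
		resp.Body.Close()
		if resp.StatusCode >= 400 {
			t.Errorf("%s (%s) returned status %d", name, url, resp.StatusCode)
		}
	}
}
```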
Customer Scenario Testing
- Make customer system scenarios easy to import
- Mock using blade-runners
- Stress and load testing these scenarios
- Metrics:
- Number of customer scenarios we can mock
- How quickly we can mock a newly added customer
Value:
- Find the break-points before customers find them
- We can also demo these scenarios and help customers test new features before they deploy to production
- Allows us to recommend Node Specs for instances based on the customer's use cases
Testing Tools
Tentatively planning on using the built-in Go tool: go test -cover
Tentative goal of 80% code coverage for the clearblade repo.
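A sketch of how the 80% goal could be enforced per CI build using go test -cover: run the suite with a coverage profile, read the total from go tool cover, and fail the build below the threshold. The threshold constant and the ./... package pattern are assumptions about how the repo is laid out.

```go
package main

import (
	"fmt"
	"os"
	"os/exec"
	"strconv"
	"strings"
)

const threshold = 80.0 // tentative goal from this document

func main() {
	// Run the test suite and write a coverage profile.
	test := exec.Command("go", "test", "-coverprofile=coverage.out", "./...")
	test.Stdout, test.Stderr = os.Stdout, os.Stderr
	if err := test.Run(); err != nil {
		fmt.Fprintln(os.Stderr, "tests failed:", err)
		os.Exit(1)
	}

	// "go tool cover -func" prints per-function coverage plus a final "total:" line.
	out, err := exec.Command("go", "tool", "cover", "-func=coverage.out").Output()
	if err != nil {
		fmt.Fprintln(os.Stderr, "coverage report failed:", err)
		os.Exit(1)
	}

	var total float64
	for _, line := range strings.Split(string(out), "\n") {
		if strings.HasPrefix(line, "total:") {
			fields := strings.Fields(line)
			pct := strings.TrimSuffix(fields[len(fields)-1], "%")
			total, _ = strconv.ParseFloat(pct, 64)
		}
	}

	fmt.Printf("total coverage: %.1f%% (goal %.0f%%)\n", total, threshold)
	if total < threshold {
		os.Exit(1)
	}
}
```

Wiring this into CI would give the per-build coverage log described under Code Coverage above.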