This page describes the test strategy for the peviitor.ro project
TEST STRATEGY
Sec I. Scope of the project
peviitor.ro is a search engine where jobs are aggregated dynamically through scrapers; the core functionality is delivered by Apache SOLR, exposed through an API and a user interface built with basic web technologies.
Sec I.1. System description
The search engine components are:
Website - the user interface
API - JSON feeds delivered by the backend
SOLR - open-source software that makes search based on text indexing possible
Scrapers - components that retrieve data from company websites and publish it to the search engine
CloudFlare - CDN system that allows a large number of users to use the system without affecting its performance
Sec II. How to test the application?
Testing on this project is done by volunteers acting as testers, who test the application from several points of view:
a) Functional tests
based on "Specifications"
based on "Test Cases"
using "Open Source" tools
b) Non-functional tests
performance tests
reliability tests
security tests
accessibility tests
Cross-Browser tests
tests on mobile phones (responsive design)
caching tests
data tests
tests for scrapers
Sec II.1. User Interface testing
The User Interface comes in several versions, each with its own particularities
User Interface testing falls under "COMPONENT TESTING"
v01.peviitor.ro
The user interface currently in production is v01. Version v01 is archived at https://v01.peviitor.ro/
The source code of v01 is available on GitHub at: https://github.com/sebiboga/peviitor
Bugs and improvements can be submitted at: https://github.com/sebiboga/peviitor/issues
Test cases for v01 are available in TestQuality at: https://web.testquality.com/site/peviitor/project/12763/tests
The production server used for v01 is an nginx server from ClausWeb
v02.peviitor.ro
The v02 user interface is archived at https://v02.peviitor.ro/
The source code of v02 is available on GitHub at: https://github.com/peviitor-ro/ui
Bugs and Improvements can be submitted at: https://github.com/peviitor-ro/ui/issues
Notes:
this version was never released
it was developed with ReactJS
v03.peviitor.ro
The v03 user interface is archived at https://beta.peviitor.ro/
The source code of v03 is available on GitHub at: https://github.com/peviitor-ro/ui-js
Bugs and Improvements can be submitted at: https://github.com/peviitor-ro/ui-js/issues
Test cases for v03 are available in TestQuality at: https://web.testquality.com/site/peviitor/project/12763/tests
Notes:
builds are made locally and deployed manually through GitHub
deployment to the server is done from the build folder
v03 was developed with JavaScript, HTML5 and CSS3
splash.peviitor.ro
Splash is the page that takes over when a part (subsystem) of the system becomes non-functional
The redirect should happen instantly once the outage is reported
Splash is a page published through Google Sites
The redirect should be cancelled once the outage is fixed
There is also a configuration that shows the splash page when accessing peviitor.ro, used when launching a new version at a specific date and time
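The splash behaviour described above can be sketched as a small decision function; the subsystem names and the launch-window mechanism below are illustrative assumptions, not the actual implementation:

```python
from datetime import datetime

SPLASH_URL = "https://splash.peviitor.ro/"  # Google Sites splash page

def should_redirect_to_splash(subsystem_up, launch_window=None, now=None):
    """Decide whether peviitor.ro should redirect to the splash page.

    subsystem_up  -- dict mapping a subsystem name ("ui", "api", "solr")
                     to a boolean health flag, as reported by monitoring
    launch_window -- optional (start, end) datetimes during which the
                     splash page is forced, for a scheduled launch
    now           -- current time (injectable for testing)
    """
    now = now or datetime.now()
    # Redirect instantly while any subsystem is reported non-functional.
    if not all(subsystem_up.values()):
        return True
    # Redirect during a scheduled launch window, if one is configured.
    if launch_window is not None:
        start, end = launch_window
        if start <= now <= end:
            return True
    # Cancel the redirect once everything is functional again.
    return False
```

A monitoring job could call this function after each health check and flip the redirect accordingly.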
api.peviitor.ro
The API can be explored in Swagger UI at: https://api.peviitor.ro/
The source code of the API is available on GitHub at: https://github.com/peviitor-ro/api
Bugs and Improvements can be submitted at: https://github.com/peviitor-ro/api/issues
Test cases for the API are available in TestQuality at: https://web.testquality.com/site/peviitor/project/12763/tests
An nginx server from ClausWeb is used for the API
Notes:
the API is developed in PHP, as a wrapper over Apache SOLR
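A functional API check can be sketched as below; note that the `/api/v0/search` path, the `q`/`page` parameters and the `total`/`jobs` response fields are hypothetical placeholders for illustration only - consult the Swagger UI at https://api.peviitor.ro/ for the real contract:

```python
from urllib.parse import urlencode

API_BASE = "https://api.peviitor.ro"

def build_search_url(query, page=1):
    """Build a search request URL. The /api/v0/search path and the
    parameter names are assumptions; check Swagger UI for the real ones."""
    return f"{API_BASE}/api/v0/search?{urlencode({'q': query, 'page': page})}"

def validate_search_response(payload):
    """Run basic functional assertions on a decoded JSON search response.
    The 'total' and 'jobs' field names are assumptions for illustration.
    Returns a list of problems; an empty list means the shape is valid."""
    errors = []
    if not isinstance(payload.get("total"), int):
        errors.append("missing integer 'total' field")
    if not isinstance(payload.get("jobs"), list):
        errors.append("missing 'jobs' list")
    return errors
```

In a real test, the decoded JSON of an HTTP GET against the built URL would be passed to `validate_search_response`.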
solr.peviitor.ro
The SOLR server runs ON-PREMISE
solr.peviitor.ro maps to zimbor.go.ro on port 80
zimbor.go.ro is served over plain HTTP (no digital certificate available)
solr.peviitor.ro is served with a digital certificate from CloudFlare
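Component tests against SOLR can use its standard `/select` query handler. In the sketch below the core name `jobs` is an assumption; the handler and the `q`/`rows`/`wt` parameters are standard Solr:

```python
from urllib.parse import urlencode

def solr_select_url(base, core, query, rows=10):
    """Build a query URL for Solr's standard /select handler.
    `core` is the Solr core/collection name (the name used here in the
    examples, 'jobs', is an assumption about this deployment)."""
    params = urlencode({"q": query, "rows": rows, "wt": "json"})
    return f"{base}/solr/{core}/select?{params}"

def num_found(solr_json):
    """Extract the hit count from a standard Solr JSON response body."""
    return solr_json["response"]["numFound"]
```

A data test can then assert, for example, that `num_found` is greater than zero after a scraper run.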
CloudFlare
CloudFlare is the CDN system that caches content at the delivery level through its own data centers
The account used in CloudFlare is sebitestb@gmail.com
CloudFlare delivers the UI, the API and also SOLR
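Caching tests can inspect the `CF-Cache-Status` response header, which Cloudflare adds to proxied responses (with values such as HIT, MISS, DYNAMIC, BYPASS, EXPIRED or REVALIDATED). A minimal classifier might look like:

```python
# CF-Cache-Status values that indicate the asset was served from cache.
CACHEABLE = {"HIT", "REVALIDATED"}

def classify_cache_status(headers):
    """Classify an HTTP response by its CF-Cache-Status header.
    Returns 'cached', 'uncached' or 'unknown' (header absent, e.g. when
    the request did not go through Cloudflare at all)."""
    status = headers.get("CF-Cache-Status")
    if status is None:
        return "unknown"
    return "cached" if status.upper() in CACHEABLE else "uncached"
```

A typical caching test requests the same static asset twice and expects the second response to classify as "cached".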
The components described above are tested independently in the first phase.
In the second phase the integration of the components (UI - API - SOLR) is tested; this phase is called
"INTEGRATION TESTING"
ignoring all internal components, the user starts the journey in the User Interface and verifies the results directly in the User Interface
this type of testing includes testing the data produced by the scrapers and verifying that the data appears in the User Interface
this type of testing includes verifying that the GitHub Actions pipeline works
this stage includes "END-TO-END TESTING"
After component integration testing, we move to the third testing phase, in which we run:
performance tests
reliability tests
security tests
accessibility tests
cross-browser tests
tests on mobile phones (responsive design)
caching tests
data tests
tests for scrapers
this being the "SYSTEM TESTING" stage
After testing the system as a whole, we move to the fourth testing phase - business validation by the users - in which we run:
Usability Testing
User Acceptance Testing
this being the "UAT TESTING" stage
After UAT, we move to the fifth testing phase - preparing the production release - in which we run:
code quality checks
finalizing/closing branches
a rehearsal deployment to the Pre-Prod Environment
verification of the production settings
creation of the Final Deploy strategy
the TEST CLOSURE activities
this being the "PRODUCTION TESTING" stage
Sec II.2. Testing activities
The testing team gets involved right from the requirements definition phase. The testing process is organized as follows:
Test Analysis
an activity that takes place at the start of the definition of each "story";
this activity also includes the test estimation;
each "test scenario" that will be implemented as a separate Test Case is identified;
Test Design
we define each test scenario in a separate Test Case;
we define the Test Data;
we define the Test Plan for each "sprint" / iteration;
we group the Test Cases in the Test Management tool;
we generate the Test Cases Report;
Test Execution and Reporting
we execute the Test Plan;
we log the bugs in the Bug Tracking tool;
we generate the Test Execution Report document;
we send the team the results of the Test Case execution;
RETEST
we retest the fixed bugs;
Regression Testing
includes the activities for validating a build;
includes running the Test Cases for a specific build;
based on the results and on the Exit Criteria, the build is declared a Release Candidate;
Test Closure
a Test Closure document is prepared for each Release Candidate;
all Known Issues are included;
the types of testing performed are listed, and the test results are attached for each type;
it is delivered together with the Test Execution Report when a Release Candidate is defined;
Status Report
delivered every day at the end of the testing activity;
includes all newly found, closed and reopened bugs;
includes the Test Cases that were run;
includes a summary report, in percentages, of the testing process with respect to Test Cases (RUN %, not-RUN %, PASS %, FAILED %, BLOCKED %);
includes the number / name of the build under test;
includes the name of the Release Candidate, if one has been identified;
sent by the testing team;
Sec II.3. Scope of testing
Sec II.3.1 In scope
Functional testing;
Automated testing;
Performance tests;
Security tests;
Tests on different browsers;
Tests on mobile devices;
Sec II.3.2 Out of scope
Disaster recovery;
Accessibility testing;
SEO testing;
Unit testing;
Sec II.4. Delivery schedule
TBC [delivery commitments for version v03 -- Sergiu Avram (code); Niculina Moldovan (test cases)]
Sec II.5. Environments
Sec II.5.1 DEV environment
the developer's laptop;
everyone must be able to bring SOLR, the API and the UI up locally for the version under development (a kit for installing the whole solution locally);
Sec II.5.2 BETA environment
the UI and the integration of the solution with the API and SOLR are validated;
Sec II.5.3 TEST environment
ONLY the data brought in by the scrapers is validated;
Sec II.5.4 PRODUCTION environment
Sec II.6. Exit Criteria
100% of Test Cases run
96% of Test Cases PASS
no open bugs with HIGH severity
96% of low-severity bugs fixed & closed
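The Exit Criteria above can be evaluated automatically. A minimal sketch, assuming the counts are exported from the Test Management and Bug Tracking tools:

```python
def exit_criteria_met(total_cases, run_cases, passed_cases,
                      open_high_bugs, low_bugs_total, low_bugs_closed):
    """Evaluate the project's Exit Criteria.

    Returns (ok, reasons): ok is True when the build can be declared a
    Release Candidate, reasons lists every criterion that failed.
    """
    reasons = []
    if run_cases < total_cases:
        reasons.append("not all Test Cases were run")
    if total_cases and 100.0 * passed_cases / total_cases < 96.0:
        reasons.append("Test Case pass rate below 96%")
    if open_high_bugs > 0:
        reasons.append("open HIGH severity bugs remain")
    if low_bugs_total and 100.0 * low_bugs_closed / low_bugs_total < 96.0:
        reasons.append("fewer than 96% of low severity bugs fixed & closed")
    return (not reasons, reasons)
```

Based on the result, the Regression Testing step can declare (or reject) a Release Candidate.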
Sec III. Defect management process
Sec III.1. Bug Lifecycle
Sec III.2. Severity
high severity
normal severity
low severity
Sec IV. The testing team
The testing team is led by a volunteer Team Lead - Niculina Moldovan.
The testing team includes:
Lenuța - tester for data
Lorena Berchesan - tester for scrapers
Dan Păun - tester for the User Interface
Olivian - tester for mobile phones
Adam Irina - tester
Sec V. Tools
POSTMAN
Apache JMeter
TestQuality https://peviitor.testquality.com/login
GitHub Issues
Figma https://www.figma.com/file/ZnxmiUT0MBUvGFv2D3vaZz/Website-designs?node-id=202%3A586
Sec VI. Deliverables
Test Strategy
Test Plan
Test Cases
Test Report
Status Report
Bug list (report of the existing open and closed bugs)
Job counter