Base Test Plan
Source: Confluence - Base Test Plan
This is our base test plan. It covers the common items we test on the RC server prior to deployment to production.
Setup of RC for Testing
Install Tags to RC Server
Use Ansible to deploy the tags using inventory/releasecandidate:
ansible-playbook -i inventory/releasecandidate playbooks/release/sprint-release.yml --extra-vars "rc_release=192.0.0"
Test Categories
XGBoost
See: ds-modeling README
Path2Performance (DMRP)
See: ds-modeling README
P2A Layers on Databricks
Launch
- If new preselect functionality to test, configure the Order
- Consider wiping old preselect configs that shouldn’t be tested every time
- Log into Orders App
- Go to order ORDER-12677
- Go to preselect page for ORDER-12678
- Update to the latest Household version
- Increment the Selection # on the History tab
- Select “Run on Databricks”, hit Accept on the Databricks estimate modal
- Select Run
Verify
- Ensure run is successful all the way through
- Check output file headers are as expected
- Check output file contents are consistent with the headers and values look reasonable (see the header check sketch after this list)
- For 1% households file reports:
- Client Report Download works, all fields filled properly
- Client Graph curve is generally smooth
- Model grade is generally smooth
- All reports display properly
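The header and consistency checks above can be done by hand; a minimal shell sketch is below, assuming the selection output is a gzipped CSV with a header row (the OUTPUT path is a placeholder, not the actual output location):

OUTPUT=/path/to/selection-output.csv.gz   # placeholder; substitute the actual selection output file
# Print the header fields one per line for comparison against the expected field list
zcat "$OUTPUT" | head -n 1 | tr ',' '\n'
# Rough check that every row has the same field count as the header (quoted commas will skew this)
zcat "$OUTPUT" | awk -F',' 'NR==1{n=NF; next} NF!=n{bad++} END{printf "%d of %d rows with unexpected field count\n", bad+0, NR-1}'
# Spot-check a few rows against the header
zcat "$OUTPUT" | head -n 5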
P2A Layers with Density on Databricks
Launch
- If new preselect functionality to test, configure the Order
- Avoid testing the same config twice between the P2A Layers run and this run
- Log into Orders App
- Go to order ORDER-19192
- Go to preselect page for ORDER-19192
- Update to the latest Household version
- Increment the Selection # on the History tab
- Select “Run on Databricks”, hit Accept on the Databricks estimate modal
- Select Run
Verify
Same as P2A Layers above.
Hotline Site Visitors
Launch
- If new hotline functionality to test, configure the Order
- Log into Orders App
- Go to ORDER-18544
- Go to preselect page for ORDER-18544
- Update to the latest Household version
- Increment the Selection # on the History tab
- Select Run
Verify
- Ensure run is successful all the way through
- Check output file headers are as expected
- Check output file contents are consistent with headers
- Look at Segmentation Report - ensure it exists and is generally ordered correctly
Email Inquiry
Launch
- If new email inquiry functionality to test, configure the Order
- Log into Orders App
- Go to ORDER-36671
- Go to preselect page for ORDER-36671
- Update to the latest Household version
- Increment the Selection # on the History tab
- Select Run
Note: A test file that generates ~30 records is at teacollection/60fb1280c6b86a4da074061d_testing
Verify
Same as Hotline Site Visitors above.
Customer Segmentation
Launch
- If new functionality to test, configure the Order
- Increment the Selection # on ORDER-21902
- Log into Orders App
- Go to page for ORDER-12677
- Update to the latest Household version
- Go to Customer Segmentation test model: ORDER-21902
- Select Run Customer Segmentation
Verify
Same as Hotline Site Visitors above.
Summary File (Reseller)
The Summary File is run monthly, with output delivered to specific clients (currently 4Cite). It contains individuals with transactions, donations, or subscriptions, placed into percentile buckets based on spending.
Launch
If new reseller functionality to test, run from command line:
#!/bin/bash
set -ex
# Load order-processing environment variables
source ~/workspace/OrderProcessingEnvVars.sh
DATE=$(date "+%Y-%m-%d")
REPORTS_DIR=/opt/path2response/reports
APP_DIR=/opt/path2response/reseller-emr/bin
DB_APP_DIR=/usr/local/bin
# Start from a clean working directory for today's run
rm -fr ~/reseller-append
mkdir -p ~/reseller-append/"$DATE"
cd ~/reseller-append/"$DATE"
# Generate the reseller append file
$DB_APP_DIR/resellerappend-alt -u -l -o ~/reseller-append/"$DATE" -s --cluster l
# Produce summary counts for the append file
node $REPORTS_DIR/summaryFileReports/summaryCounts.js --file ~/reseller-append/"$DATE"/reseller-append.csv.gz --counts ~/reseller-append/"$DATE"/reseller-append-counts.json.gz
# Run QC against the append file, logging output to qc.log
$APP_DIR/resellerqc.sc -m -c ~/convertConfig2.json -r file:///home/chouk/reseller-append/"$DATE"/reseller-append.csv.gz -o ~/reseller-append/"$DATE" --counts-file ~/reseller-append/"$DATE"/reseller-append-counts.json.gz 2>&1 | tee ~/reseller-append/"$DATE"/qc.log
# Build the client-facing counts CSV and rename the append file for delivery
$APP_DIR/resellercounts.sc -c reseller-append-counts.json.gz -o cat_nmad-"$DATE"-counts.csv
mv reseller-append.csv.gz cat_nmad-"$DATE".csv.gz
# List the final contents of the run directory
ls ~/reseller-append/"$DATE"
Verify
- Ensure reseller run is successful all the way through
- Ensure reseller counts script is successful
- Check output file headers are as expected
- Check output file contents are consistent and values look reasonable
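A quick manual pass over the delivered files, using the filenames produced by the script above (illustrative, not exhaustive; assumes the append file carries a header row):

cd ~/reseller-append/$(date "+%Y-%m-%d")
# Confirm both delivery files exist and are non-trivial in size
ls -lh cat_nmad-*.csv.gz cat_nmad-*-counts.csv
# Inspect the header and a handful of records
zcat cat_nmad-*.csv.gz | head -n 5
# Total record count (excluding the header row)
zcat cat_nmad-*.csv.gz | tail -n +2 | wc -l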
Fulfillment
Launch
- If fulfillment #1 exists for ORDER-12677, clear it out:
- Edit ORDER-12677 in Jira, remove Fulfillment File 1 value
- Refresh ORDER-12677 in Orders App
- Go to fulfillment page #1, hit Reset and Start Over
- Log into Orders App
- Go to ORDER-12677
- Configure models with keycodes for desired fulfillment
- Go to fulfillment page for fulfillment #1
- Review fulfillment configuration
- Ensure “Run on Databricks” is checked
- Select Run Fulfillment
Verify
- Ensure run is successful all the way through
- Check output file headers are as expected
- Ensure results are as expected for desired use case
- Ensure sitevisit data has been uploaded to HDFS (if configured) and S3
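For the sitevisit upload check, a hedged sketch is below; the HDFS and S3 locations are placeholders, not the actual configured destinations:

HDFS_PATH=/path/to/sitevisit/dir        # placeholder; substitute the configured HDFS destination
S3_PATH=s3://example-bucket/sitevisit/  # placeholder; substitute the configured S3 destination
# Confirm files landed on HDFS (only if HDFS upload is configured for the order)
hdfs dfs -ls "$HDFS_PATH"
# Confirm the matching objects exist on S3
aws s3 ls "$S3_PATH" --recursive | tail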
Shipment
Launch
- Run fulfillment test first to ensure valid fulfillment file exists
- If shipment #1 exists for ORDER-12677, clear it out:
- Edit ORDER-12677 in Jira, remove Fulfillment Ship Date 1 and Fulfillment History Data 1
- Refresh ORDER-12677 in Orders App
- Go to shipment page #1, hit Reset and Start Over
- Log into Orders App
- Go to ORDER-12677
- Go to shipment page for shipment #1
- Run Check File, ensure analysis passes (no duplications, line numbers as expected)
- Choose test shipment route (p2r.superbowl_kj)
- Ensure the Contacts section does not contain client emails and does contain an email address you can check
- Select Ship
Verify
Ensure it:
- Successfully zips the fulfillment file
- Successfully puts the file on FTP
- Successfully sends the shipment email
- Successfully updates Jira issue with Fulfillment Ship Date 1 and Fulfillment History Data 1
- Successfully puts copy of fulfillment in /mnt/data/prod/completed_fulfillments
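A couple of these can be spot-checked from the shell; the zip path below is a placeholder:

# Confirm the completed-fulfillments copy landed (path from the checklist above)
ls -lt /mnt/data/prod/completed_fulfillments | head
# If you have the zipped fulfillment locally, confirm it lists the expected file (placeholder path)
unzip -l /path/to/fulfillment.zip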
Data Tools
Launch
- Log in as datatoolsuser to rc01
- Restart all Data Tools services:
cd /opt/datatools/
node scripts/restartCLI.js stop
node scripts/restartCLI.js start -b
- Set up testing files:
- Remove prior test datafile mongo docs (superbowl client, ~6 docs)
- Ensure files are at p2r.files.com in /p2r.superbowl/incoming/path with spaces/
- Copy test files from /home/datatools/base_test_files if not already available
- Copy 'test'_fulfillment_file.txt to /mnt/data/client/path2response/.load/
- Manually retrieve base_test_house_and_transactions.zip and Product Upload 20190607.psv
- Map and convert test files:
- House file: Send to builder, define mapping, trigger Production convert
- Transaction file: Send to builder, define mapping, trigger Production convert, delete and reload
Verify
- Ensure MongoDB data is coming through (tab numbers populated)
- Ensure basic automation by following path2response mailfile through each stage
- Ensure EFS and S3 state reflects Data Tools state
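A minimal spot-check of the EFS/S3 item, assuming access to the EFS mount used earlier in this checklist; the S3 bucket/prefix is a placeholder:

# EFS side: the .load directory used during file setup
ls -l /mnt/data/client/path2response/.load/
# S3 side: placeholder bucket/prefix; substitute the Data Tools bucket for RC
aws s3 ls s3://example-datatools-bucket/path2response/ --recursive | tail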
Shiny Reports
Verify
Ensure these DRAX reports load successfully in RC and Production:
-
- Select title from dropdown, ensure all tabs load
- Download housefile report successfully
- Verify housefile report downloads from Data Tools
-
- Select title from dropdown, ensure all tabs load
Dashboards
Audit
Run as dashboards user:
cd /opt/dashboards/reports/
./audit.js -d
Ensure the audit completes without errors (third-party service errors for Salesforce, QuickBooks, etc. may still occur)
Admin
If Google dependencies changed, log in with Google admin rights:
- Go to dashboards.path2response.com/admin
- Click “Generate Groups for Users”
Reports
Run various reports from the front-end UI:
| Report | URL | Notes |
|---|---|---|
| Revenue Report | /reports/revenue | Basic reporting |
| Recommendation Report | /reports/recommendation | Open in Excel |
| Backlog Report | /reports/backlog | |
| Triage Report | /reports/triage | |
| Planning Report | /reports/planning | |
| Merged Report | /reports/merged | Tests Bitbucket |
Service
Run as service-user (password in Ansible):
| Service | URL |
|---|---|
| Client Results | /services/client_results?json=true |
| Jira Users | /services/jira_users |
| Jira User (by name) | /services/jira_users/jmalone |
| Jira User (by ID) | /services/jira_users/557058:fc90fdb7-fc07-4b78-b1c2-3619db1b85b1 |
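The service endpoints can also be exercised from the command line. The sketch below assumes HTTP basic auth with the service user (the auth scheme and the $SERVICE_PASSWORD variable are assumptions; the credentials live in Ansible) and uses the dashboards host from the Admin section, so adjust for the RC host:

# Hit two of the service endpoints from the table above
curl -u service-user:"$SERVICE_PASSWORD" "https://dashboards.path2response.com/services/client_results?json=true"
curl -u service-user:"$SERVICE_PASSWORD" "https://dashboards.path2response.com/services/jira_users"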