
Base Test Plan

Source: Confluence - Base Test Plan

This is our base test plan, covering the common checks we run on RC prior to deployment to production.


Setup of RC for Testing

Install Tags to RC Server

Use Ansible to deploy the tags using inventory/releasecandidate:

ansible-playbook -i inventory/releasecandidate playbooks/release/sprint-release.yml --extra-vars "rc_release=192.0.0"

Test Categories

XGBoost

See: ds-modeling README

Path2Performance (DMRP)

See: ds-modeling README


P2A Layers on Databricks

Launch

  1. If there is new preselect functionality to test, configure the Order
    • Consider wiping old preselect configs that shouldn’t be tested every time
  2. Log into Orders App
  3. Go to order ORDER-12677
  4. Go to preselect page for ORDER-12678
  5. Update to the latest Household version
  6. Increment the Selection # on the History tab
  7. Select “Run on Databricks”, then accept the Databricks estimate modal
  8. Select Run

Verify

  1. Ensure run is successful all the way through
  2. Check output file headers are as expected
  3. Check output file contents are consistent with headers and values look reasonable
  4. For 1% households file reports:
    • Client Report Download works, all fields filled properly
    • Client Graph curve is generally smooth
    • Model grade is generally smooth
    • All reports display properly
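The header and row-count checks above can be scripted from a shell prompt. This is a minimal sketch: the file name and expected header list are placeholders, and a tiny stand-in file is generated so the sketch runs on its own (substitute the real preselect output and its actual columns).

```shell
# Hypothetical header/row-count check for a preselect output file.
# FILE and EXPECTED are placeholders for the real output and its columns.
FILE=preselect_output.csv
EXPECTED="household_id,model_score,percentile"

# Fabricate a tiny stand-in file so the sketch runs on its own.
printf 'household_id,model_score,percentile\n1,0.91,99\n2,0.45,52\n' > "$FILE"

HEADER=$(head -1 "$FILE")
[ "$HEADER" = "$EXPECTED" ] && echo "headers ok" || echo "header mismatch: $HEADER"

# The file should contain data rows beyond the header.
ROWS=$(($(wc -l < "$FILE") - 1))
echo "rows: $ROWS"
```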

P2A Layers with Density on Databricks

Launch

  1. If there is new preselect functionality to test, configure the Order
    • Avoid testing the same config twice between P2A Layers and this run
  2. Log into Orders App
  3. Go to order ORDER-19192
  4. Go to preselect page for ORDER-19192
  5. Update to the latest Household version
  6. Increment the Selection # on the History tab
  7. Select “Run on Databricks”, then accept the Databricks estimate modal
  8. Select Run

Verify

Same as P2A Layers above.


Hotline Site Visitors

Launch

  1. If there is new hotline functionality to test, configure the Order
  2. Log into Orders App
  3. Go to ORDER-18544
  4. Go to preselect page for ORDER-18544
  5. Update to the latest Household version
  6. Increment the Selection # on the History tab
  7. Select Run

Verify

  1. Ensure run is successful all the way through
  2. Check output file headers are as expected
  3. Check output file contents are consistent with headers
  4. Look at Segmentation Report - ensure it exists and is generally ordered correctly
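The "generally ordered correctly" check on the Segmentation Report can be approximated with awk. A sketch under stated assumptions: the file name, two-column layout, and descending-score expectation are hypothetical stand-ins, not the real report format.

```shell
# Hypothetical ordering check: the score column of the Segmentation Report
# should be (roughly) descending. SEGFILE and the column layout are stand-ins.
SEGFILE=segmentation_report.csv
printf 'segment,score\nA,0.9\nB,0.7\nC,0.4\n' > "$SEGFILE"

RESULT=$(tail -n +2 "$SEGFILE" | awk -F, '
  NR > 1 && $2 + 0 > prev + 0 { bad++ }   # count ascending violations
  { prev = $2 }
  END { print (bad ? bad " out-of-order rows" : "ordering ok") }
')
echo "$RESULT"
```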

Email Inquiry

Launch

  1. If there is new email inquiry functionality to test, configure the Order
  2. Log into Orders App
  3. Go to ORDER-36671
  4. Go to preselect page for ORDER-36671
  5. Update to the latest Household version
  6. Increment the Selection # on the History tab
  7. Select Run

Note: A test file that generates ~30 records is at teacollection/60fb1280c6b86a4da074061d_testing

Verify

Same as Hotline Site Visitors above.


Customer Segmentation

Launch

  1. If there is new functionality to test, configure the Order
  2. Increment the Selection # on ORDER-21902
  3. Log into Orders App
  4. Go to page for ORDER-12677
  5. Update to the latest Household version
  6. Go to Customer Segmentation test model: ORDER-21902
  7. Select Run Customer Segmentation

Verify

Same as Hotline Site Visitors above.


Summary File (Reseller)

The Summary File is run monthly, with output delivered to specific clients (currently 4Cite). It contains individuals with transactions, donations, or subscriptions, placed into percentile buckets based on spending.

Launch

If there is new reseller functionality to test, run the following from the command line:

#!/bin/bash
set -ex
source ~/workspace/OrderProcessingEnvVars.sh
DATE=$(date "+%Y-%m-%d")
REPORTS_DIR=/opt/path2response/reports
APP_DIR=/opt/path2response/reseller-emr/bin
DB_APP_DIR=/usr/local/bin

# Start from a clean working directory for today's run
rm -fr ~/reseller-append
mkdir -p ~/reseller-append/"$DATE"
cd ~/reseller-append/"$DATE"

# Generate the reseller append file
$DB_APP_DIR/resellerappend-alt -u -l -o ~/reseller-append/"$DATE" -s --cluster l

# Produce summary counts, then run QC against them (QC output is tee'd to qc.log)
node $REPORTS_DIR/summaryFileReports/summaryCounts.js --file ~/reseller-append/"$DATE"/reseller-append.csv.gz --counts ~/reseller-append/"$DATE"/reseller-append-counts.json.gz
$APP_DIR/resellerqc.sc -m -c ~/convertConfig2.json -r file:///home/chouk/reseller-append/"$DATE"/reseller-append.csv.gz -o ~/reseller-append/"$DATE" --counts-file ~/reseller-append/"$DATE"/reseller-append-counts.json.gz 2>&1 | tee ~/reseller-append/"$DATE"/qc.log

# Convert counts to CSV, then rename outputs for delivery
$APP_DIR/resellercounts.sc -c reseller-append-counts.json.gz -o cat_nmad-"$DATE"-counts.csv
mv reseller-append.csv.gz cat_nmad-"$DATE".csv.gz
ls ~/reseller-append/"$DATE"

Verify

  1. Ensure reseller run is successful all the way through
  2. Ensure reseller counts script is successful
  3. Check output file headers are as expected
  4. Check output file contents are consistent and values look reasonable
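A quick sanity check on the counts artifact can also be scripted. This sketch generates a stand-in for the real `reseller-append-counts.json.gz` so it runs on its own; the JSON keys shown are illustrative, not the actual counts schema.

```shell
# Hypothetical check that the counts artifact is an intact gzip
# containing non-empty JSON. A stand-in file is generated here.
COUNTS=reseller-append-counts.json.gz
printf '{"total": 12345, "buckets": 100}' | gzip > "$COUNTS"

gzip -t "$COUNTS" && echo "gzip ok"
gunzip -c "$COUNTS" | grep -q '"total"' && echo "counts present"
```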

Fulfillment

Launch

  1. If fulfillment #1 exists for ORDER-12677, clear it out:
    • Edit ORDER-12677 in Jira, remove Fulfillment File 1 value
    • Refresh ORDER-12677 in Orders App
    • Go to fulfillment page #1, hit Reset and Start Over
  2. Log into Orders App
  3. Go to ORDER-12677
  4. Configure models with keycodes for desired fulfillment
  5. Go to fulfillment page for fulfillment #1
  6. Review fulfillment configuration
  7. Ensure “Run on Databricks” is checked
  8. Select Run Fulfillment

Verify

  1. Ensure run is successful all the way through
  2. Check output file headers are as expected
  3. Ensure results are as expected for desired use case
  4. Ensure sitevisit data has been uploaded to HDFS (if configured) and S3

Shipment

Launch

  1. Run fulfillment test first to ensure valid fulfillment file exists
  2. If shipment #1 exists for ORDER-12677, clear it out:
    • Edit ORDER-12677 in Jira, remove Fulfillment Ship Date 1 and Fulfillment History Data 1
    • Refresh ORDER-12677 in Orders App
    • Go to shipment page #1, hit Reset and Start Over
  3. Log into Orders App
  4. Go to ORDER-12677
  5. Go to shipment page for shipment #1
  6. Run Check File, ensure analysis passes (no duplications, line numbers as expected)
  7. Choose test shipment route (p2r.superbowl_kj)
  8. Ensure the Contacts section does not contain client emails and does contain an email address you can check
  9. Select Ship
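The "Check File" analysis in step 6 (no duplications, line numbers as expected) can be approximated by hand. A sketch: the file name and expected line count are placeholders, and a stand-in fulfillment file is generated so the sketch runs on its own.

```shell
# Hypothetical pre-ship check: no duplicate records, expected line count.
FULFILLMENT=fulfillment_1.txt
EXPECTED_LINES=4

# Stand-in fulfillment file for the sketch.
printf 'rec1\nrec2\nrec3\nrec4\n' > "$FULFILLMENT"

DUPES=$(sort "$FULFILLMENT" | uniq -d | wc -l)
[ "$DUPES" -eq 0 ] && echo "no duplicates"

LINES=$(wc -l < "$FULFILLMENT")
[ "$LINES" -eq "$EXPECTED_LINES" ] && echo "line count ok"
```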

Verify

Ensure it:

  1. Successfully zips the fulfillment file
  2. Successfully puts the file on FTP
  3. Successfully sends the shipment email
  4. Successfully updates Jira issue with Fulfillment Ship Date 1 and Fulfillment History Data 1
  5. Successfully puts copy of fulfillment in /mnt/data/prod/completed_fulfillments

Data Tools

Launch

  1. Login as datatools user to rc01
  2. Restart all Data Tools Services:
    cd /opt/datatools/
    node scripts/restartCLI.js stop
    node scripts/restartCLI.js start -b
    
  3. Set up testing files:
    • Remove prior test datafile mongo docs (superbowl client, ~6 docs)
    • Ensure files at p2r.files.com in /p2r.superbowl/incoming/path with spaces/
    • Copy test files if not available from /home/datatools/base_test_files
    • Copy 'test'_fulfillment_file.txt to /mnt/data/client/path2response/.load/
    • Manually retrieve base_test_house_and_transactions.zip and Product Upload 20190607.psv
  4. Map and convert test files:
    • House file: Send to builder, define mapping, trigger Production convert
    • Transaction file: Send to builder, define mapping, trigger Production convert, delete and reload

Verify

  1. Ensure MongoDB data is coming through (tab numbers populated)
  2. Ensure basic automation by following path2response mailfile through each stage
  3. Ensure EFS and S3 state reflects Data Tools state
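Step 3's EFS-vs-S3 comparison can be sketched as a diff of sorted listings. The real paths and the S3 listing command (e.g. `aws s3 ls`) are assumptions not verified here, so stand-in directories and files are used to keep the sketch self-contained.

```shell
# Hypothetical state comparison: list both sides, then diff the listings.
# In practice the left side would come from the EFS mount and the right
# side from an S3 listing (assumption, not verified here).
EFS_DIR=$(mktemp -d)
S3_RAW=$(mktemp)
touch "$EFS_DIR/house.psv" "$EFS_DIR/trans.psv"   # stand-in EFS state
printf 'house.psv\ntrans.psv\n' > "$S3_RAW"       # stand-in S3 listing

EFS_LIST=$(mktemp)
S3_LIST=$(mktemp)
ls "$EFS_DIR" | sort > "$EFS_LIST"
sort "$S3_RAW" > "$S3_LIST"
diff "$EFS_LIST" "$S3_LIST" && echo "EFS and S3 listings match"
```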

Shiny Reports

Verify

Ensure these DRAX reports load successfully in RC and Production:

  1. Preselect History

    • Select title from dropdown, ensure all tabs load
    • Download housefile report successfully
    • Verify housefile report downloads from Data Tools
  2. Title Benchmark

    • Select title from dropdown, ensure all tabs load

Dashboards

Audit

Run as dashboards user:

cd /opt/dashboards/reports/
./audit.js -d

Ensure the audit completes without errors (errors from third-party services such as Salesforce or QuickBooks may occur and are acceptable).

Admin

If Google dependencies changed, log in with Google admin rights:

Reports

Run various reports from front-end UI:

Report                 URL                      Notes
Revenue Report         /reports/revenue         Basic reporting
Recommendation Report  /reports/recommendation  Open in Excel
Backlog Report         /reports/backlog
Triage Report          /reports/triage
Planning Report        /reports/planning
Merged Report          /reports/merged          Tests Bitbucket

Service

Run as service-user (password in Ansible):