Verify that everything is okay and that advisories are deduplicated: we had 119 million advisories earlier, and after running the dedupe pipeline we have 18 million.
Then deploy to production.
Add advisory ID
Add an advisory_id field to the Advisory model and create a schema migration.
Move the url field to just below the advisory_id field.
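The model change above could be sketched as follows. This is a minimal illustration using a plain dataclass as a stand-in for the Django Advisory model (in the real code these would be model field columns, and the fields other than advisory_id, url, and created_by are hypothetical), just to show where advisory_id sits relative to url:

```python
from dataclasses import dataclass, fields

@dataclass
class Advisory:
    # Hypothetical stand-in for the Django Advisory model; in the real
    # code these would be models.CharField / models.URLField columns.
    summary: str
    advisory_id: str = ""  # new field: importer-specific advisory identifier
    url: str = ""          # moved to sit just below advisory_id
    created_by: str = ""   # name of the importer that created this advisory

# The declared field order mirrors the requested layout: url directly
# below advisory_id.
field_names = [f.name for f in fields(Advisory)]
print(field_names)  # ['summary', 'advisory_id', 'url', 'created_by']
```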
Add an improver pipeline to populate the advisory ID. Each advisory's created_by (the importer that created it) calls for a different treatment to determine the advisory ID from one of the aliases, the URL, or the references.
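One possible shape for that per-importer logic, sketched in plain Python. The importer names and the exact rules here are assumptions for illustration, not the project's actual API; the point is only that each created_by value maps to its own way of extracting an ID from aliases, the URL, or the references:

```python
from urllib.parse import urlparse

def pick_advisory_id(created_by, aliases, url, references):
    """Hypothetical heuristic to derive an advisory ID per importer.

    Some importers expose the ID as an alias (e.g. a GHSA id), some
    encode it in the advisory URL, and some only carry it in a
    reference URL.
    """
    if created_by == "github_importer":
        # GitHub advisories carry a GHSA alias.
        for alias in aliases:
            if alias.startswith("GHSA-"):
                return alias
    if created_by == "nvd_importer":
        # NVD advisories are identified by their CVE alias.
        for alias in aliases:
            if alias.startswith("CVE-"):
                return alias
    if url:
        # Fall back to the last path segment of the advisory URL.
        segment = urlparse(url).path.rstrip("/").rsplit("/", 1)[-1]
        if segment:
            return segment
    for ref in references:
        # Last resort: same last-segment rule on the first usable reference.
        segment = urlparse(ref).path.rstrip("/").rsplit("/", 1)[-1]
        if segment:
            return segment
    return ""
```

For example, `pick_advisory_id("nvd_importer", ["CVE-2024-1234"], "", [])` returns `"CVE-2024-1234"`, while an importer with no alias rule falls back to the URL path.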
Update all importers and improvers to account for the new advisory ID field (import_runner and improve_runner as well).
Affected packages: create a relationship between a package and an advisory, and migrate existing data.
References: create AdvisoryReferences, and migrate existing data.
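The two relationship tasks above could be sketched together like this. Dataclasses stand in for what would be Django models with ForeignKey fields, and all field names beyond the advisory link are assumptions for illustration:

```python
from dataclasses import dataclass

@dataclass
class AffectedPackage:
    # Hypothetical link between a package and an advisory; in Django
    # this would be a model with ForeignKey fields.
    advisory_id: int
    package_purl: str                  # package URL (purl) of the affected package
    affected_version_range: str = ""   # e.g. a vers version range string

@dataclass
class AdvisoryReferences:
    # Reference attached directly to an advisory.
    advisory_id: int
    url: str
    reference_type: str = ""           # e.g. "advisory", "fix", "issue"

pkg = AffectedPackage(advisory_id=1, package_purl="pkg:pypi/example")
ref = AdvisoryReferences(advisory_id=1, url="https://example.org/advisory/1")
print(pkg.package_purl, ref.url)
```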
Severities: severities need to be refactored. Create new advisory severities that do not go through references. The new model will be like VulnerabilitySeverity but directly associated with an advisory.
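The refactor above (severity attached directly to the advisory rather than reached through a reference) could look roughly like this; a dataclass stands in for the Django model, and the field names are assumptions modeled on VulnerabilitySeverity:

```python
from dataclasses import dataclass

@dataclass
class AdvisorySeverity:
    # Like VulnerabilitySeverity, but tied directly to an advisory
    # instead of hanging off a reference.
    advisory_id: int            # link to the Advisory row (direct association)
    scoring_system: str         # e.g. "cvssv3.1"
    value: str                  # e.g. "9.8"
    scoring_elements: str = ""  # e.g. a CVSS vector string

severity = AdvisorySeverity(
    advisory_id=1,
    scoring_system="cvssv3.1",
    value="9.8",
    scoring_elements="CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H",
)
print(severity.scoring_system, severity.value)
```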
Validate and deploy advisories dedupe
Add advisory ID
Add other fields ...
Design how to relate to a vulnerability
Update API (v2) and UI.
Remove old models, old fields and old data.
QnA
How to decide the advisory ID when importers share the exact same aliases? For example, if two importers only have the alias CVE-XXXX-YYYY, what should the heuristic be?
Ans: The advisory ID will not be a unique field on its own; it will be part of a unique-together constraint: (url, advisory_id, created_by, etc...)
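The answer above implies the dedupe key is the tuple (url, advisory_id, created_by): two importers sharing the same CVE-only alias still yield distinct rows because created_by differs, while exact duplicates collapse. A small plain-Python sketch with illustrative data:

```python
def dedupe_key(advisory):
    # The unique-together constraint sketched as a tuple key.
    return (advisory["url"], advisory["advisory_id"], advisory["created_by"])

def dedupe(advisories):
    seen, kept = set(), []
    for adv in advisories:
        key = dedupe_key(adv)
        if key not in seen:
            seen.add(key)
            kept.append(adv)
    return kept

advisories = [
    # Two importers share the same CVE-only alias: distinct created_by
    # keeps both rows, so advisory_id alone need not be unique.
    {"url": "https://nvd.example/CVE-2024-0001", "advisory_id": "CVE-2024-0001", "created_by": "nvd_importer"},
    {"url": "https://osv.example/CVE-2024-0001", "advisory_id": "CVE-2024-0001", "created_by": "osv_importer"},
    # Exact duplicate of the first row: collapses away.
    {"url": "https://nvd.example/CVE-2024-0001", "advisory_id": "CVE-2024-0001", "created_by": "nvd_importer"},
]
print(len(dedupe(advisories)))  # 2
```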
Complete the migration and the API on the basis of the data models.