From patchwork Sat Mar 21 13:18:23 2026
X-Patchwork-Submitter: Stefano Tondo
X-Patchwork-Id: 84046
From: stondo@gmail.com
To: openembedded-core@lists.openembedded.org
Cc: Ross.Burton@arm.com, jpewhacker@gmail.com, stefano.tondo.ext@siemens.com,
 Peter.Marko@siemens.com, adrian.freihofer@siemens.com,
 mathieu.dubois-briand@bootlin.com
Subject: [OE-core][PATCH v11 1/4] spdx30: Add configurable file exclusion pattern support
Date: Sat, 21 Mar 2026 14:18:23 +0100
Message-ID: <20260321131826.1401671-2-stondo@gmail.com>
In-Reply-To: <20260321131826.1401671-1-stondo@gmail.com>
References: <20260321131826.1401671-1-stondo@gmail.com>
X-Groupsio-URL: https://lists.openembedded.org/g/openembedded-core/message/233658

From: Stefano Tondo

Add a SPDX_FILE_EXCLUDE_PATTERNS variable that allows filtering files
from the SPDX output by regex matching. The variable accepts a
space-separated list of Python regular expressions; files whose paths
match any pattern (via re.search) are excluded. When empty (the
default), no filtering is applied and all files are included,
preserving the existing behavior.

This enables users to reduce SBOM size by excluding files that are not
relevant for compliance (e.g. test files, object files, patches).

Excluded files are tracked in a set returned from add_package_files()
and passed to get_package_sources_from_debug(), which uses the set for
precise cross-checking rather than re-evaluating the patterns.

Signed-off-by: Stefano Tondo
---
 meta/classes-recipe/cargo_common.bbclass |   3 +
 meta/classes-recipe/cpan.bbclass         |  11 +
 meta/classes-recipe/go-mod.bbclass       |   6 +
 meta/classes-recipe/npm.bbclass          |   7 +
 meta/classes-recipe/pypi.bbclass         |   6 +-
 meta/classes/spdx-common.bbclass         |   7 +
 meta/lib/oe/spdx30_tasks.py              | 667 ++++++++++++-----------
 7 files changed, 375 insertions(+), 332 deletions(-)

diff --git a/meta/classes-recipe/cargo_common.bbclass b/meta/classes-recipe/cargo_common.bbclass
index bc44ad7918..0d3edfe4a7 100644
--- a/meta/classes-recipe/cargo_common.bbclass
+++ b/meta/classes-recipe/cargo_common.bbclass
@@ -240,3 +240,6 @@ EXPORT_FUNCTIONS do_configure
 # https://github.com/rust-lang/libc/issues/3223
 # https://github.com/rust-lang/libc/pull/3175
 INSANE_SKIP:append = " 32bit-time"
+
+# Generate ecosystem-specific Package URL for SPDX
+SPDX_PACKAGE_URLS =+ "pkg:cargo/${BPN}@${PV} "
diff --git a/meta/classes-recipe/cpan.bbclass b/meta/classes-recipe/cpan.bbclass
index
bb76a5b326..87ebed124a 100644
--- a/meta/classes-recipe/cpan.bbclass
+++ b/meta/classes-recipe/cpan.bbclass
@@ -68,4 +68,15 @@ cpan_do_install () {
 	done
 }
 
+# Generate ecosystem-specific Package URL for SPDX
+def cpan_spdx_name(d):
+    bpn = d.getVar('BPN')
+    if bpn.startswith('perl-'):
+        return bpn[5:]
+    elif bpn.startswith('libperl-'):
+        return bpn[8:]
+    return bpn
+
+SPDX_PACKAGE_URLS =+ "pkg:cpan/${@cpan_spdx_name(d)}@${PV} "
+
 EXPORT_FUNCTIONS do_configure do_compile do_install
diff --git a/meta/classes-recipe/go-mod.bbclass b/meta/classes-recipe/go-mod.bbclass
index a15dda8f0e..5b3cb2d8b9 100644
--- a/meta/classes-recipe/go-mod.bbclass
+++ b/meta/classes-recipe/go-mod.bbclass
@@ -32,3 +32,6 @@ do_compile[dirs] += "${B}/src/${GO_WORKDIR}"
 # Make go install unpack the module zip files in the module cache directory
 # before the license directory is polulated with license files.
 addtask do_compile before do_populate_lic
+
+# Generate ecosystem-specific Package URL for SPDX
+SPDX_PACKAGE_URLS =+ "pkg:golang/${GO_IMPORT}@${PV} "
diff --git a/meta/classes-recipe/npm.bbclass b/meta/classes-recipe/npm.bbclass
index 344e8b4bec..7bb791d543 100644
--- a/meta/classes-recipe/npm.bbclass
+++ b/meta/classes-recipe/npm.bbclass
@@ -354,4 +354,11 @@ FILES:${PN} += " \
     ${nonarch_libdir} \
 "
 
+# Generate ecosystem-specific Package URL for SPDX
+def npm_spdx_name(d):
+    bpn = d.getVar('BPN')
+    return bpn[5:] if bpn.startswith('node-') else bpn
+
+SPDX_PACKAGE_URLS =+ "pkg:npm/${@npm_spdx_name(d)}@${PV} "
+
 EXPORT_FUNCTIONS do_configure do_compile do_install
diff --git a/meta/classes-recipe/pypi.bbclass b/meta/classes-recipe/pypi.bbclass
index 9d46c035f6..e2d054af6d 100644
--- a/meta/classes-recipe/pypi.bbclass
+++ b/meta/classes-recipe/pypi.bbclass
@@ -43,7 +43,8 @@ SECTION = "devel/python"
 SRC_URI:prepend = "${PYPI_SRC_URI} "
 S = "${UNPACKDIR}/${PYPI_PACKAGE}-${PV}"
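The cpan and npm hunks above both strip a Yocto recipe-name prefix before building the PURL. The same mapping can be illustrated standalone in plain Python, outside the BitBake datastore (the function names `cpan_name`, `npm_name`, and `purl` here are this sketch's, not the patch's):

```python
# Sketch of the recipe-name to ecosystem-name mapping added by
# cpan_spdx_name()/npm_spdx_name(); prefixes mirror the bbclass code.
def cpan_name(bpn: str) -> str:
    # Yocto Perl recipes are conventionally named perl-* or libperl-*
    if bpn.startswith("perl-"):
        return bpn[len("perl-"):]
    if bpn.startswith("libperl-"):
        return bpn[len("libperl-"):]
    return bpn

def npm_name(bpn: str) -> str:
    # Yocto npm recipes are conventionally named node-*
    return bpn[len("node-"):] if bpn.startswith("node-") else bpn

def purl(ptype: str, name: str, version: str) -> str:
    # pkg:<type>/<name>@<version>, as in the SPDX_PACKAGE_URLS assignments
    return f"pkg:{ptype}/{name}@{version}"

print(purl("cpan", cpan_name("libperl-extutils"), "1.0"))  # pkg:cpan/extutils@1.0
print(purl("npm", npm_name("node-lodash"), "4.17.21"))     # pkg:npm/lodash@4.17.21
```

Names without a recognized prefix pass through unchanged, matching the `return bpn` fallbacks in the patch.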
-UPSTREAM_CHECK_PYPI_PACKAGE ?= "${PYPI_PACKAGE}"
+# Replace any '_' characters in the pypi URI with '-'s to follow the PyPi website naming conventions
+UPSTREAM_CHECK_PYPI_PACKAGE ?= "${@pypi_normalize(d)}"
 
 # Use the simple repository API rather than the potentially unstable project URL
 # More information on the pypi API specification is avaialble here:
@@ -54,3 +55,6 @@ UPSTREAM_CHECK_URI ?= "https://pypi.org/simple/${@pypi_normalize(d)}/"
 UPSTREAM_CHECK_REGEX ?= "${UPSTREAM_CHECK_PYPI_PACKAGE}-(?P<pver>(\d+[\.\-_]*)+).(tar\.gz|tgz|zip|tar\.bz2)"
 CVE_PRODUCT ?= "python:${PYPI_PACKAGE}"
+
+# Generate ecosystem-specific Package URL for SPDX
+SPDX_PACKAGE_URLS =+ "pkg:pypi/${@pypi_normalize(d)}@${PV} "
diff --git a/meta/classes/spdx-common.bbclass b/meta/classes/spdx-common.bbclass
index 83f05579b6..40701730a6 100644
--- a/meta/classes/spdx-common.bbclass
+++ b/meta/classes/spdx-common.bbclass
@@ -82,6 +82,13 @@ SPDX_MULTILIB_SSTATE_ARCHS[doc] = "The list of sstate architectures to consider
 when collecting SPDX dependencies. This includes multilib architectures when \
 multilib is enabled. Defaults to SSTATE_ARCHS."
 
+SPDX_FILE_EXCLUDE_PATTERNS ??= ""
+SPDX_FILE_EXCLUDE_PATTERNS[doc] = "Space-separated list of Python regular \
+    expressions to exclude files from SPDX output. Files whose paths match \
+    any pattern (via re.search) will be filtered out. Defaults to empty \
+    (no filtering). Example: \
+    SPDX_FILE_EXCLUDE_PATTERNS = '\\.patch$ \\.diff$ /test/ \\.pyc$ \\.o$'"
+
 python () {
     from oe.cve_check import extend_cve_status
     extend_cve_status(d)
diff --git a/meta/lib/oe/spdx30_tasks.py b/meta/lib/oe/spdx30_tasks.py
index 353d783fa2..b94868dc87 100644
--- a/meta/lib/oe/spdx30_tasks.py
+++ b/meta/lib/oe/spdx30_tasks.py
@@ -13,6 +13,8 @@ import oe.spdx30
 import oe.spdx_common
 import oe.sdk
 import os
+import re
+import urllib.parse
 from contextlib import contextmanager
 from datetime import datetime, timezone
@@ -32,9 +34,7 @@ def set_timestamp_now(d, o, prop):
         delattr(o, prop)
 
 
-def add_license_expression(
-    d, objset, license_expression, license_data, search_objsets=[]
-):
+def add_license_expression(d, objset, license_expression, license_data):
     simple_license_text = {}
     license_text_map = {}
     license_ref_idx = 0
@@ -46,15 +46,14 @@ def add_license_expression(
         if name in simple_license_text:
             return simple_license_text[name]
 
-        for o in [objset] + search_objsets:
-            lic = o.find_filter(
-                oe.spdx30.simplelicensing_SimpleLicensingText,
-                name=name,
-            )
+        lic = objset.find_filter(
+            oe.spdx30.simplelicensing_SimpleLicensingText,
+            name=name,
+        )
 
-            if lic is not None:
-                simple_license_text[name] = lic
-                return lic
+        if lic is not None:
+            simple_license_text[name] = lic
+            return lic
 
         lic = objset.add(
             oe.spdx30.simplelicensing_SimpleLicensingText(
@@ -148,42 +147,36 @@ def add_package_files(
     ignore_dirs=[],
     ignore_top_level_dirs=[],
 ):
     source_date_epoch = d.getVar("SOURCE_DATE_EPOCH")
     if source_date_epoch:
         source_date_epoch = int(source_date_epoch)
 
     spdx_files = set()
+    excluded_files = set()
 
     file_counter = 1
     if not os.path.exists(topdir):
         bb.note(f"Skip {topdir}")
-        return spdx_files
+        return spdx_files, excluded_files
 
     check_compiled_sources = d.getVar("SPDX_INCLUDE_COMPILED_SOURCES") == "1"
     if check_compiled_sources:
         compiled_sources, types = oe.spdx_common.get_compiled_sources(d)
         bb.debug(1, f"Total compiled files: {len(compiled_sources)}")
 
     for subdir, dirs, files in os.walk(topdir, onerror=walk_error):
         dirs[:] = [d for d in dirs if d not in ignore_dirs]
         if subdir == str(topdir):
             dirs[:] = [d for d in dirs if d not in ignore_top_level_dirs]
 
         dirs.sort()
         files.sort()
         for file in files:
            filepath = Path(subdir) / file
            if filepath.is_symlink() or not filepath.is_file():
                continue
 
            filename = str(filepath.relative_to(topdir))
+
+            # Apply file exclusion filtering
+            if exclude_patterns:
+                if any(p.search(filename) for p in exclude_patterns):
+                    excluded_files.add(filename)
+                    continue
+
            file_purposes = get_purposes(filepath)
 
            # Check if file is compiled
            if check_compiled_sources:
-                if not oe.spdx_common.is_compiled_source(
-                    filename, compiled_sources, types
-                ):
+                if not oe.spdx_common.is_compiled_source(filename, compiled_sources, types):
                    continue
 
            spdx_file = objset.new_file(
@@ -218,12 +211,15 @@ def add_package_files(
     bb.debug(1, "Added %d files to %s" % (len(spdx_files), objset.doc._id))
 
-    return spdx_files
+    return spdx_files, excluded_files
 
 
 def get_package_sources_from_debug(
-    d, package, package_files, sources, source_hash_cache
+    d, package, package_files, sources, source_hash_cache, excluded_files=None
 ):
+    if excluded_files is None:
+        excluded_files = set()
+
     def file_path_match(file_path, pkg_file):
         if file_path.lstrip("/") == pkg_file.name.lstrip("/"):
             return True
@@ -256,6 +252,12 @@ def get_package_sources_from_debug(
             continue
 
         if not any(file_path_match(file_path, pkg_file) for pkg_file in package_files):
+            if file_path.lstrip("/") in excluded_files:
+                bb.debug(
+                    1,
+                    f"Skipping debug source lookup for excluded file {file_path} in {package}",
+                )
+                continue
             bb.fatal(
                 "No package file found for %s in %s; SPDX found: %s"
                % (str(file_path), package, "
".join(p.name for p in package_files)) @@ -298,14 +300,17 @@ def get_package_sources_from_debug( return dep_source_files -def collect_dep_objsets(d, direct_deps, subdir, fn_prefix, obj_type, **attr_filter): +def collect_dep_objsets(d, build): + deps = oe.spdx_common.get_spdx_deps(d) + dep_objsets = [] - dep_objs = set() + dep_builds = set() - for dep in direct_deps: + dep_build_spdxids = set() + for dep in deps: bb.debug(1, "Fetching SPDX for dependency %s" % (dep.pn)) - dep_obj, dep_objset = oe.sbom30.find_root_obj_in_jsonld( - d, subdir, fn_prefix + dep.pn, obj_type, **attr_filter + dep_build, dep_objset = oe.sbom30.find_root_obj_in_jsonld( + d, "recipes", "recipe-" + dep.pn, oe.spdx30.build_Build ) # If the dependency is part of the taskhash, return it to be linked # against. Otherwise, it cannot be linked against because this recipe @@ -313,10 +318,10 @@ def collect_dep_objsets(d, direct_deps, subdir, fn_prefix, obj_type, **attr_filt if dep.in_taskhash: dep_objsets.append(dep_objset) - # The object _can_ be linked against (by alias) - dep_objs.add(dep_obj) + # The build _can_ be linked against (by alias) + dep_builds.add(dep_build) - return dep_objsets, dep_objs + return dep_objsets, dep_builds def index_sources_by_hash(sources, dest): @@ -359,6 +364,120 @@ def collect_dep_sources(dep_objsets, dest): index_sources_by_hash(e.to, dest) +def _generate_git_purl(d, download_location, srcrev): + """Generate a Package URL for a Git source from its download location. + + Parses the Git URL to identify the hosting service and generates the + appropriate PURL type. Supports github.com by default and custom + mappings via SPDX_GIT_PURL_MAPPINGS. + + Returns the PURL string or None if no mapping matches. 
+ """ + if not download_location or not download_location.startswith('git+'): + return None + + git_url = download_location[4:] # Remove 'git+' prefix + + # Default handler: github.com + git_purl_handlers = { + 'github.com': 'pkg:github', + } + + # Custom PURL mappings from SPDX_GIT_PURL_MAPPINGS + # Format: "domain1:purl_type1 domain2:purl_type2" + custom_mappings = d.getVar('SPDX_GIT_PURL_MAPPINGS') + if custom_mappings: + for mapping in custom_mappings.split(): + parts = mapping.split(':', 1) + if len(parts) == 2: + git_purl_handlers[parts[0]] = parts[1] + bb.debug(2, f"Added custom Git PURL mapping: {parts[0]} -> {parts[1]}") + else: + bb.warn(f"Invalid SPDX_GIT_PURL_MAPPINGS entry: {mapping} (expected format: domain:purl_type)") + + try: + parsed = urllib.parse.urlparse(git_url) + except Exception: + return None + + hostname = parsed.hostname + if not hostname: + return None + + for domain, purl_type in git_purl_handlers.items(): + if hostname == domain: + path = parsed.path.strip('/') + path_parts = path.split('/') + if len(path_parts) >= 2: + owner = path_parts[0] + repo = path_parts[1].replace('.git', '') + return f"{purl_type}/{owner}/{repo}@{srcrev}" + break + + return None + + +def _enrich_source_package(d, dl, fd, file_name, primary_purpose): + """Enrich a source download package with version, PURL, and external refs. + + Extracts version from SRCREV for Git sources, generates PURLs for + known hosting services, and adds external references for VCS, + distribution URLs, and homepage. 
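The `_generate_git_purl()` logic can be sketched standalone: strip the `git+` prefix, map the hostname to a PURL type (github.com by default), and build `pkg:<type>/<owner>/<repo>@<rev>`. In this sketch the handler table is a plain dict rather than the SPDX_GIT_PURL_MAPPINGS datastore variable, and the `git_purl` name and `removesuffix` call are this sketch's choices, not the patch's:

```python
from urllib.parse import urlparse

def git_purl(download_location: str, srcrev: str, handlers=None):
    # Default handler table; the patch extends this via SPDX_GIT_PURL_MAPPINGS
    handlers = handlers or {"github.com": "pkg:github"}
    if not download_location or not download_location.startswith("git+"):
        return None
    parsed = urlparse(download_location[4:])  # drop the "git+" prefix
    purl_type = handlers.get(parsed.hostname)
    if purl_type is None:
        return None
    parts = parsed.path.strip("/").split("/")
    if len(parts) < 2:
        return None  # need at least owner/repo in the path
    owner, repo = parts[0], parts[1].removesuffix(".git")
    return f"{purl_type}/{owner}/{repo}@{srcrev}"

print(git_purl("git+https://github.com/rust-lang/libc.git", "a1b2c3"))
# pkg:github/rust-lang/libc@a1b2c3
```

Unknown hosts fall through to `None`, in which case the patch falls back to the ecosystem PURLs from SPDX_PACKAGE_URLS.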
+ """ + version = None + purl = None + + if fd.type == "git": + # Use full SHA-1 from fd.revision + srcrev = getattr(fd, 'revision', None) + if srcrev and srcrev not in {'${AUTOREV}', 'AUTOINC', 'INVALID'}: + version = srcrev + + # Generate PURL for Git hosting services + download_location = getattr(dl, 'software_downloadLocation', None) + if version and download_location: + purl = _generate_git_purl(d, download_location, version) + else: + # Use ecosystem PURL from SPDX_PACKAGE_URLS if available + package_urls = (d.getVar('SPDX_PACKAGE_URLS') or '').split() + for url in package_urls: + if not url.startswith('pkg:yocto'): + purl = url + break + + if version: + dl.software_packageVersion = version + + if purl: + dl.software_packageUrl = purl + + # Add external references + download_location = getattr(dl, 'software_downloadLocation', None) + if download_location and isinstance(download_location, str): + dl.externalRef = dl.externalRef or [] + + if download_location.startswith('git+'): + # VCS reference for Git repositories + git_url = download_location[4:] + if '@' in git_url: + git_url = git_url.split('@')[0] + + dl.externalRef.append( + oe.spdx30.ExternalRef( + externalRefType=oe.spdx30.ExternalRefType.vcs, + locator=[git_url], + ) + ) + elif download_location.startswith(('http://', 'https://', 'ftp://')): + # Distribution reference for tarball/archive downloads + dl.externalRef.append( + oe.spdx30.ExternalRef( + externalRefType=oe.spdx30.ExternalRefType.altDownloadLocation, + locator=[download_location], + ) + ) + + def add_download_files(d, objset): inputs = set() @@ -422,10 +541,14 @@ def add_download_files(d, objset): ) ) + _enrich_source_package(d, dl, fd, file_name, primary_purpose) + if fd.method.supports_checksum(fd): # TODO Need something better than hard coding this for checksum_id in ["sha256", "sha1"]: - expected_checksum = getattr(fd, "%s_expected" % checksum_id, None) + expected_checksum = getattr( + fd, "%s_expected" % checksum_id, None + ) if 
expected_checksum is None: continue @@ -462,220 +585,6 @@ def set_purposes(d, element, *var_names, force_purposes=[]): ] -def set_purls(spdx_package, purls): - if purls: - spdx_package.software_packageUrl = purls[0] - - for p in sorted(set(purls)): - spdx_package.externalIdentifier.append( - oe.spdx30.ExternalIdentifier( - externalIdentifierType=oe.spdx30.ExternalIdentifierType.packageUrl, - identifier=p, - ) - ) - - -def get_is_native(d): - return bb.data.inherits_class("native", d) or bb.data.inherits_class("cross", d) - - -def create_recipe_spdx(d): - deploydir = Path(d.getVar("SPDXRECIPEDEPLOY")) - deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX")) - pn = d.getVar("PN") - - license_data = oe.spdx_common.load_spdx_license_data(d) - - include_vex = d.getVar("SPDX_INCLUDE_VEX") - if not include_vex in ("none", "current", "all"): - bb.fatal("SPDX_INCLUDE_VEX must be one of 'none', 'current', 'all'") - - recipe_objset = oe.sbom30.ObjectSet.new_objset(d, "static-" + pn) - - recipe = recipe_objset.add_root( - oe.spdx30.software_Package( - _id=recipe_objset.new_spdxid("recipe", pn), - creationInfo=recipe_objset.doc.creationInfo, - name=d.getVar("PN"), - software_packageVersion=d.getVar("PV"), - software_primaryPurpose=oe.spdx30.software_SoftwarePurpose.specification, - software_sourceInfo=json.dumps( - { - "FILENAME": os.path.basename(d.getVar("FILE")), - "FILE_LAYERNAME": d.getVar("FILE_LAYERNAME"), - }, - separators=(",", ":"), - ), - ) - ) - - if get_is_native(d): - ext = oe.sbom30.OERecipeExtension() - ext.is_native = True - recipe.extension.append(ext) - - set_purls(recipe, (d.getVar("SPDX_PACKAGE_URLS") or "").split()) - - # TODO: This doesn't work before do_unpack because the license text has to - # be available for recipes with NO_GENERIC_LICENSE - # recipe_spdx_license = add_license_expression( - # d, - # recipe_objset, - # d.getVar("LICENSE"), - # license_data, - # ) - # recipe_objset.new_relationship( - # [recipe], - # 
oe.spdx30.RelationshipType.hasDeclaredLicense, - # [oe.sbom30.get_element_link_id(recipe_spdx_license)], - # ) - - if val := d.getVar("HOMEPAGE"): - recipe.software_homePage = val - - if val := d.getVar("SUMMARY"): - recipe.summary = val - - if val := d.getVar("DESCRIPTION"): - recipe.description = val - - for cpe_id in oe.cve_check.get_cpe_ids( - d.getVar("CVE_PRODUCT"), d.getVar("CVE_VERSION") - ): - recipe.externalIdentifier.append( - oe.spdx30.ExternalIdentifier( - externalIdentifierType=oe.spdx30.ExternalIdentifierType.cpe23, - identifier=cpe_id, - ) - ) - - direct_deps = oe.spdx_common.collect_direct_deps(d, "do_create_recipe_spdx") - - dep_objsets, dep_recipes = collect_dep_objsets( - d, direct_deps, "static", "static-", oe.spdx30.software_Package - ) - - if dep_recipes: - recipe_objset.new_scoped_relationship( - [recipe], - oe.spdx30.RelationshipType.dependsOn, - oe.spdx30.LifecycleScopeType.build, - sorted(oe.sbom30.get_element_link_id(dep) for dep in dep_recipes), - ) - - # Add CVEs - cve_by_status = {} - if include_vex != "none": - patched_cves = oe.cve_check.get_patched_cves(d) - for cve, patched_cve in patched_cves.items(): - mapping = patched_cve["abbrev-status"] - detail = patched_cve["status"] - description = patched_cve.get("justification", None) - resources = patched_cve.get("resource", []) - - # If this CVE is fixed upstream, skip it unless all CVEs are - # specified. 
- if include_vex != "all" and detail in ( - "fixed-version", - "cpe-stable-backport", - ): - bb.debug(1, "Skipping %s since it is already fixed upstream" % cve) - continue - - spdx_cve = recipe_objset.new_cve_vuln(cve) - - cve_by_status.setdefault(mapping, {})[cve] = ( - spdx_cve, - detail, - description, - resources, - ) - - all_cves = set() - for status, cves in cve_by_status.items(): - for cve, items in cves.items(): - spdx_cve, detail, description, resources = items - spdx_cve_id = oe.sbom30.get_element_link_id(spdx_cve) - - all_cves.add(spdx_cve) - - if status == "Patched": - spdx_vex = recipe_objset.new_vex_patched_relationship( - [spdx_cve_id], [recipe] - ) - patches = [] - for idx, filepath in enumerate(resources): - patches.append( - recipe_objset.new_file( - recipe_objset.new_spdxid( - "patch", str(idx), os.path.basename(filepath) - ), - os.path.basename(filepath), - filepath, - purposes=[oe.spdx30.software_SoftwarePurpose.patch], - hashfile=os.path.isfile(filepath), - ) - ) - - if patches: - recipe_objset.new_scoped_relationship( - spdx_vex, - oe.spdx30.RelationshipType.patchedBy, - oe.spdx30.LifecycleScopeType.build, - patches, - ) - - elif status == "Unpatched": - recipe_objset.new_vex_unpatched_relationship([spdx_cve_id], [recipe]) - elif status == "Ignored": - spdx_vex = recipe_objset.new_vex_ignored_relationship( - [spdx_cve_id], - [recipe], - impact_statement=description, - ) - - vex_just_type = d.getVarFlag("CVE_CHECK_VEX_JUSTIFICATION", detail) - if vex_just_type: - if ( - vex_just_type - not in oe.spdx30.security_VexJustificationType.NAMED_INDIVIDUALS - ): - bb.fatal( - f"Unknown vex justification '{vex_just_type}', detail '{detail}', for ignored {cve}" - ) - - for v in spdx_vex: - v.security_justificationType = ( - oe.spdx30.security_VexJustificationType.NAMED_INDIVIDUALS[ - vex_just_type - ] - ) - - elif status == "Unknown": - bb.note(f"Skipping {cve} with status 'Unknown'") - else: - bb.fatal(f"Unknown {cve} status '{status}'") - - if 
all_cves: - recipe_objset.new_relationship( - [recipe], - oe.spdx30.RelationshipType.hasAssociatedVulnerability, - sorted(list(all_cves)), - ) - - oe.sbom30.write_recipe_jsonld_doc(d, recipe_objset, "static", deploydir) - - -def load_recipe_spdx(d): - - return oe.sbom30.find_root_obj_in_jsonld( - d, - "static", - "static-" + d.getVar("PN"), - oe.spdx30.software_Package, - ) - - def create_spdx(d): def set_var_field(var, obj, name, package=None): val = None @@ -690,17 +599,19 @@ def create_spdx(d): license_data = oe.spdx_common.load_spdx_license_data(d) - pn = d.getVar("PN") deploydir = Path(d.getVar("SPDXDEPLOY")) deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX")) spdx_workdir = Path(d.getVar("SPDXWORK")) include_sources = d.getVar("SPDX_INCLUDE_SOURCES") == "1" pkg_arch = d.getVar("SSTATE_PKGARCH") - is_native = get_is_native(d) - - recipe, recipe_objset = load_recipe_spdx(d) + is_native = bb.data.inherits_class("native", d) or bb.data.inherits_class( + "cross", d + ) + include_vex = d.getVar("SPDX_INCLUDE_VEX") + if not include_vex in ("none", "current", "all"): + bb.fatal("SPDX_INCLUDE_VEX must be one of 'none', 'current', 'all'") - build_objset = oe.sbom30.ObjectSet.new_objset(d, "build-" + pn) + build_objset = oe.sbom30.ObjectSet.new_objset(d, "recipe-" + d.getVar("PN")) build = build_objset.new_task_build("recipe", "recipe") build_objset.set_element_alias(build) @@ -718,13 +629,47 @@ def create_spdx(d): build_inputs = set() + # Add CVEs + cve_by_status = {} + if include_vex != "none": + patched_cves = oe.cve_check.get_patched_cves(d) + for cve, patched_cve in patched_cves.items(): + decoded_status = { + "mapping": patched_cve["abbrev-status"], + "detail": patched_cve["status"], + "description": patched_cve.get("justification", None) + } + + # If this CVE is fixed upstream, skip it unless all CVEs are + # specified. 
+ if ( + include_vex != "all" + and "detail" in decoded_status + and decoded_status["detail"] + in ( + "fixed-version", + "cpe-stable-backport", + ) + ): + bb.debug(1, "Skipping %s since it is already fixed upstream" % cve) + continue + + spdx_cve = build_objset.new_cve_vuln(cve) + build_objset.set_element_alias(spdx_cve) + + cve_by_status.setdefault(decoded_status["mapping"], {})[cve] = ( + spdx_cve, + decoded_status["detail"], + decoded_status["description"], + ) + cpe_ids = oe.cve_check.get_cpe_ids(d.getVar("CVE_PRODUCT"), d.getVar("CVE_VERSION")) source_files = add_download_files(d, build_objset) build_inputs |= source_files recipe_spdx_license = add_license_expression( - d, build_objset, d.getVar("LICENSE"), license_data, [recipe_objset] + d, build_objset, d.getVar("LICENSE"), license_data ) build_objset.new_relationship( source_files, @@ -737,7 +682,7 @@ def create_spdx(d): bb.debug(1, "Adding source files to SPDX") oe.spdx_common.get_patched_src(d) - files = add_package_files( + files, _ = add_package_files( d, build_objset, spdx_workdir, @@ -753,12 +698,7 @@ def create_spdx(d): build_inputs |= files index_sources_by_hash(files, dep_sources) - direct_deps = oe.spdx_common.collect_direct_deps(d, "do_create_spdx") - - dep_objsets, dep_builds = collect_dep_objsets( - d, direct_deps, "builds", "build-", oe.spdx30.build_Build - ) - + dep_objsets, dep_builds = collect_dep_objsets(d, build) if dep_builds: build_objset.new_scoped_relationship( [build], @@ -828,7 +768,16 @@ def create_spdx(d): or "" ).split() - set_purls(spdx_package, purls) + if purls: + spdx_package.software_packageUrl = purls[0] + + for p in sorted(set(purls)): + spdx_package.externalIdentifier.append( + oe.spdx30.ExternalIdentifier( + externalIdentifierType=oe.spdx30.ExternalIdentifierType.packageUrl, + identifier=p, + ) + ) pkg_objset.new_scoped_relationship( [oe.sbom30.get_element_link_id(build)], @@ -837,13 +786,6 @@ def create_spdx(d): [spdx_package], ) - pkg_objset.new_scoped_relationship( - 
[oe.sbom30.get_element_link_id(recipe)], - oe.spdx30.RelationshipType.generates, - oe.spdx30.LifecycleScopeType.build, - [spdx_package], - ) - for cpe_id in cpe_ids: spdx_package.externalIdentifier.append( oe.spdx30.ExternalIdentifier( @@ -877,11 +819,7 @@ def create_spdx(d): package_license = d.getVar("LICENSE:%s" % package) if package_license and package_license != d.getVar("LICENSE"): package_spdx_license = add_license_expression( - d, - build_objset, - package_license, - license_data, - [recipe_objset], + d, build_objset, package_license, license_data ) else: package_spdx_license = recipe_spdx_license @@ -894,9 +832,7 @@ def create_spdx(d): # Add concluded license relationship if manually set # Only add when license analysis has been explicitly performed - concluded_license_str = d.getVar( - "SPDX_CONCLUDED_LICENSE:%s" % package - ) or d.getVar("SPDX_CONCLUDED_LICENSE") + concluded_license_str = d.getVar("SPDX_CONCLUDED_LICENSE:%s" % package) or d.getVar("SPDX_CONCLUDED_LICENSE") if concluded_license_str: concluded_spdx_license = add_license_expression( d, build_objset, concluded_license_str, license_data @@ -908,8 +844,61 @@ def create_spdx(d): [oe.sbom30.get_element_link_id(concluded_spdx_license)], ) + # NOTE: CVE Elements live in the recipe collection + all_cves = set() + for status, cves in cve_by_status.items(): + for cve, items in cves.items(): + spdx_cve, detail, description = items + spdx_cve_id = oe.sbom30.get_element_link_id(spdx_cve) + + all_cves.add(spdx_cve_id) + + if status == "Patched": + pkg_objset.new_vex_patched_relationship( + [spdx_cve_id], [spdx_package] + ) + elif status == "Unpatched": + pkg_objset.new_vex_unpatched_relationship( + [spdx_cve_id], [spdx_package] + ) + elif status == "Ignored": + spdx_vex = pkg_objset.new_vex_ignored_relationship( + [spdx_cve_id], + [spdx_package], + impact_statement=description, + ) + + vex_just_type = d.getVarFlag( + "CVE_CHECK_VEX_JUSTIFICATION", detail + ) + if vex_just_type: + if ( + vex_just_type + 
not in oe.spdx30.security_VexJustificationType.NAMED_INDIVIDUALS + ): + bb.fatal( + f"Unknown vex justification '{vex_just_type}', detail '{detail}', for ignored {cve}" + ) + + for v in spdx_vex: + v.security_justificationType = oe.spdx30.security_VexJustificationType.NAMED_INDIVIDUALS[ + vex_just_type + ] + + elif status == "Unknown": + bb.note(f"Skipping {cve} with status 'Unknown'") + else: + bb.fatal(f"Unknown {cve} status '{status}'") + + if all_cves: + pkg_objset.new_relationship( + [spdx_package], + oe.spdx30.RelationshipType.hasAssociatedVulnerability, + sorted(list(all_cves)), + ) + bb.debug(1, "Adding package files to SPDX for package %s" % pkg_name) - package_files = add_package_files( + package_files, excluded_files = add_package_files( d, pkg_objset, pkgdest / package, @@ -932,7 +921,8 @@ def create_spdx(d): if include_sources: debug_sources = get_package_sources_from_debug( - d, package, package_files, dep_sources, source_hash_cache + d, package, package_files, dep_sources, source_hash_cache, + excluded_files=excluded_files, ) debug_source_ids |= set( oe.sbom30.get_element_link_id(d) for d in debug_sources @@ -944,7 +934,7 @@ def create_spdx(d): if include_sources: bb.debug(1, "Adding sysroot files to SPDX") - sysroot_files = add_package_files( + sysroot_files, _ = add_package_files( d, build_objset, d.expand("${COMPONENTS_DIR}/${PACKAGE_ARCH}/${PN}"), @@ -985,27 +975,27 @@ def create_spdx(d): status = "enabled" if feature in enabled else "disabled" build.build_parameter.append( oe.spdx30.DictionaryEntry( - key=f"PACKAGECONFIG:{feature}", value=status + key=f"PACKAGECONFIG:{feature}", + value=status ) ) - bb.note( - f"Added PACKAGECONFIG entries: {len(enabled)} enabled, {len(disabled)} disabled" - ) + bb.note(f"Added PACKAGECONFIG entries: {len(enabled)} enabled, {len(disabled)} disabled") - oe.sbom30.write_recipe_jsonld_doc(d, build_objset, "builds", deploydir) + oe.sbom30.write_recipe_jsonld_doc(d, build_objset, "recipes", deploydir) def 
create_package_spdx(d): deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX")) deploydir = Path(d.getVar("SPDXRUNTIMEDEPLOY")) + is_native = bb.data.inherits_class("native", d) or bb.data.inherits_class( + "cross", d + ) - direct_deps = oe.spdx_common.collect_direct_deps(d, "do_create_spdx") - - providers = oe.spdx_common.collect_package_providers(d, direct_deps) + providers = oe.spdx_common.collect_package_providers(d) pkg_arch = d.getVar("SSTATE_PKGARCH") - if get_is_native(d): + if is_native: return bb.build.exec_func("read_subpackage_metadata", d) @@ -1179,15 +1169,14 @@ def write_bitbake_spdx(d): def collect_build_package_inputs(d, objset, build, packages, files_by_hash=None): import oe.sbom30 - direct_deps = oe.spdx_common.collect_direct_deps(d, "do_create_spdx") - - providers = oe.spdx_common.collect_package_providers(d, direct_deps) + providers = oe.spdx_common.collect_package_providers(d) build_deps = set() + missing_providers = set() for name in sorted(packages.keys()): if name not in providers: - bb.note(f"Unable to find SPDX provider for '{name}'") + missing_providers.add(name) continue pkg_name, pkg_hashfn = providers[name] @@ -1206,6 +1195,11 @@ def collect_build_package_inputs(d, objset, build, packages, files_by_hash=None) for h, f in pkg_objset.by_sha256_hash.items(): files_by_hash.setdefault(h, set()).update(f) + if missing_providers: + bb.fatal( + f"Unable to find SPDX provider(s) for: {', '.join(sorted(missing_providers))}" + ) + if build_deps: objset.new_scoped_relationship( [build], @@ -1326,18 +1320,18 @@ def create_image_spdx(d): image_filename = image["filename"] image_path = image_deploy_dir / image_filename if os.path.isdir(image_path): - a = add_package_files( - d, - objset, - image_path, - lambda file_counter: objset.new_spdxid( - "imagefile", str(file_counter) - ), - lambda filepath: [], - license_data=None, - ignore_dirs=[], - ignore_top_level_dirs=[], - archive=None, + a, _ = add_package_files( + d, + objset, + image_path, + lambda 
file_counter: objset.new_spdxid( + "imagefile", str(file_counter) + ), + lambda filepath: [], + license_data=None, + ignore_dirs=[], + ignore_top_level_dirs=[], + archive=None, ) artifacts.extend(a) else: @@ -1364,6 +1358,7 @@ def create_image_spdx(d): set_timestamp_now(d, a, "builtTime") + if artifacts: objset.new_scoped_relationship( [image_build], @@ -1423,6 +1418,16 @@ def create_image_sbom_spdx(d): objset, sbom = oe.sbom30.create_sbom(d, image_name, root_elements) + # Set supplier on root elements if SPDX_IMAGE_SUPPLIER is defined + supplier = objset.new_agent("SPDX_IMAGE_SUPPLIER", add=False) + if supplier is not None: + supplier_id = supplier if isinstance(supplier, str) else supplier._id + if not isinstance(supplier, str): + objset.add(supplier) + for elem in sbom.rootElement: + if hasattr(elem, "suppliedBy"): + elem.suppliedBy = supplier_id + oe.sbom30.write_jsonld_doc(d, objset, spdx_path) def make_image_link(target_path, suffix): @@ -1534,16 +1539,16 @@ def create_sdk_sbom(d, sdk_deploydir, spdx_work_dir, toolchain_outputname): d, toolchain_outputname, sorted(list(files)), [rootfs_objset] ) + # Set supplier on root elements if SPDX_SDK_SUPPLIER is defined + supplier = objset.new_agent("SPDX_SDK_SUPPLIER", add=False) + if supplier is not None: + supplier_id = supplier if isinstance(supplier, str) else supplier._id + if not isinstance(supplier, str): + objset.add(supplier) + for elem in sbom.rootElement: + if hasattr(elem, "suppliedBy"): + elem.suppliedBy = supplier_id + oe.sbom30.write_jsonld_doc( d, objset, sdk_deploydir / (toolchain_outputname + ".spdx.json") ) - - -def create_recipe_sbom(d, deploydir): - sbom_name = d.getVar("SPDX_RECIPE_SBOM_NAME") - - recipe, recipe_objset = load_recipe_spdx(d) - - objset, sbom = oe.sbom30.create_sbom(d, sbom_name, [recipe], [recipe_objset]) - - oe.sbom30.write_jsonld_doc(d, objset, deploydir / (sbom_name + ".spdx.json")) From patchwork Sat Mar 21 13:18:24 2026 Content-Type: text/plain; charset="utf-8" MIME-Version: 
1.0
Content-Transfer-Encoding: 7bit
X-Patchwork-Submitter: Stefano Tondo
X-Patchwork-Id: 84044
From: stondo@gmail.com
To: openembedded-core@lists.openembedded.org
Cc: Ross.Burton@arm.com, jpewhacker@gmail.com, stefano.tondo.ext@siemens.com, Peter.Marko@siemens.com, adrian.freihofer@siemens.com, mathieu.dubois-briand@bootlin.com, Joshua Watt
Subject: [OE-core][PATCH v11 2/4] spdx30: Add supplier support for image and SDK SBOMs
Date: Sat, 21 Mar 2026 14:18:24 +0100
Message-ID: <20260321131826.1401671-3-stondo@gmail.com>
X-Mailer: git-send-email 2.53.0
In-Reply-To: <20260321131826.1401671-1-stondo@gmail.com>
References: <20260321131826.1401671-1-stondo@gmail.com>
MIME-Version: 1.0
X-Groupsio-URL: https://lists.openembedded.org/g/openembedded-core/message/233659

From: Stefano Tondo

Add SPDX_IMAGE_SUPPLIER and SPDX_SDK_SUPPLIER variables that allow
setting a supplier agent on image and SDK SBOM root elements using the
suppliedBy property. These follow the existing SPDX_PACKAGE_SUPPLIER
pattern and use the standard agent variable system to define supplier
information.

Signed-off-by: Stefano Tondo
Reviewed-by: Joshua Watt
---
 meta/classes/create-spdx-3.0.bbclass | 10 ++++++++++
 1 file changed, 10 insertions(+)

diff --git a/meta/classes/create-spdx-3.0.bbclass b/meta/classes/create-spdx-3.0.bbclass
index 7515f460c3..9a6606dce6 100644
--- a/meta/classes/create-spdx-3.0.bbclass
+++ b/meta/classes/create-spdx-3.0.bbclass
@@ -124,6 +124,16 @@ SPDX_ON_BEHALF_OF[doc] = "The base variable name to describe the Agent on who's
 SPDX_PACKAGE_SUPPLIER[doc] = "The base variable name to describe the Agent who \
     is supplying artifacts produced by the build"
 
+SPDX_IMAGE_SUPPLIER[doc] = "The base variable name to describe the Agent who \
+    is supplying the image SBOM. The supplier will be set on all root elements \
+    of the image SBOM using the suppliedBy property. If not set, no supplier \
+    information will be added to the image SBOM."
+
+SPDX_SDK_SUPPLIER[doc] = "The base variable name to describe the Agent who \
+    is supplying the SDK SBOM. The supplier will be set on all root elements \
+    of the SDK SBOM using the suppliedBy property. If not set, no supplier \
+    information will be added to the SDK SBOM."
+ SPDX_PACKAGE_VERSION ??= "${PV}" SPDX_PACKAGE_VERSION[doc] = "The version of a package, software_packageVersion \ in software_Package"
From patchwork Sat Mar 21 13:18:25 2026
Content-Type: text/plain; charset="utf-8"
MIME-Version: 1.0
Content-Transfer-Encoding: 7bit
X-Patchwork-Submitter: Stefano Tondo
X-Patchwork-Id: 84045
From: stondo@gmail.com
To: openembedded-core@lists.openembedded.org
Cc: Ross.Burton@arm.com, jpewhacker@gmail.com, stefano.tondo.ext@siemens.com, Peter.Marko@siemens.com, adrian.freihofer@siemens.com, mathieu.dubois-briand@bootlin.com
Subject: [OE-core][PATCH v11 3/4] spdx30: Enrich source downloads with version and PURL
Date: Sat, 21 Mar 2026 14:18:25 +0100
Message-ID:
<20260321131826.1401671-4-stondo@gmail.com>
X-Mailer: git-send-email 2.53.0
In-Reply-To: <20260321131826.1401671-1-stondo@gmail.com>
References: <20260321131826.1401671-1-stondo@gmail.com>
MIME-Version: 1.0
X-Groupsio-URL: https://lists.openembedded.org/g/openembedded-core/message/233660

From: Stefano Tondo

Add version extraction, PURL generation, and external references to
source download packages in SPDX 3.0 SBOMs:

- Extract version from SRCREV for Git sources (full SHA-1)
- Generate PURLs for Git sources on github.com by default
- Support custom mappings via the SPDX_GIT_PURL_MAPPINGS variable
  (format: "domain:purl_type", parsed with split(':', 1))
- Use ecosystem PURLs from SPDX_PACKAGE_URLS for non-Git sources
- Add VCS external references for Git downloads
- Add distribution external references for tarball downloads
- Parse Git URLs using urllib.parse
- Extract logic into _generate_git_purl() and _enrich_source_package()
  helpers

For non-Git sources, the version is not set from PV, since the recipe
version does not necessarily reflect the version of each individual
downloaded file. Ecosystem PURLs (which include a version) from
SPDX_PACKAGE_URLS are still used when available.

The SPDX_GIT_PURL_MAPPINGS variable allows configuring PURL generation
for self-hosted Git services (e.g., GitLab). github.com is always
mapped to pkg:github by default.
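The "domain:purl_type" mapping scheme described above can be sketched as follows. This is an illustrative reading of the commit message, not the actual _generate_git_purl() helper from the patch; the function names and the PURL path layout are assumptions.

```python
from urllib.parse import urlparse

# github.com is always mapped to pkg:github by default, per the commit message.
DEFAULT_MAPPINGS = {"github.com": "pkg:github"}

def parse_purl_mappings(var_value):
    """Parse a space-separated "domain:purl_type" list, e.g.
    "gitlab.example.com:pkg:gitlab". split(':', 1) keeps the PURL type
    intact even though it contains a colon itself."""
    mappings = dict(DEFAULT_MAPPINGS)
    for entry in (var_value or "").split():
        domain, purl_type = entry.split(":", 1)
        mappings[domain] = purl_type
    return mappings

def git_purl(repo_url, srcrev, mappings):
    """Build a PURL like pkg:github/owner/repo@<sha> for a mapped host,
    or return None when the host has no mapping."""
    parsed = urlparse(repo_url)
    purl_type = mappings.get(parsed.hostname)
    if purl_type is None:
        return None
    path = parsed.path.strip("/").removesuffix(".git")
    return f"{purl_type}/{path}@{srcrev}"

mappings = parse_purl_mappings("gitlab.example.com:pkg:gitlab")
print(git_purl("https://github.com/owner/repo.git", "abc123", mappings))
# pkg:github/owner/repo@abc123
```

Unmapped hosts yield no PURL, which matches the defensive behaviour the selftests in patch 4/4 exercise.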
Signed-off-by: Stefano Tondo --- meta/classes/create-spdx-3.0.bbclass | 7 + meta/lib/oe/spdx30_tasks.py | 444 +++++++++++++++++---------- 2 files changed, 293 insertions(+), 158 deletions(-) diff --git a/meta/classes/create-spdx-3.0.bbclass b/meta/classes/create-spdx-3.0.bbclass index 9a6606dce6..265dc525bc 100644 --- a/meta/classes/create-spdx-3.0.bbclass +++ b/meta/classes/create-spdx-3.0.bbclass @@ -156,6 +156,13 @@ SPDX_RECIPE_SBOM_NAME ?= "${PN}-recipe-sbom" SPDX_RECIPE_SBOM_NAME[doc] = "The name of output recipe SBoM when using \ create_recipe_sbom" +SPDX_GIT_PURL_MAPPINGS ??= "" +SPDX_GIT_PURL_MAPPINGS[doc] = "A space separated list of domain:purl_type \ + mappings to configure PURL generation for Git source downloads. \ + For example, 'gitlab.example.com:pkg:gitlab' maps repositories hosted \ + on gitlab.example.com to the pkg:gitlab PURL type. \ + github.com is always mapped to pkg:github by default." + IMAGE_CLASSES:append = " create-spdx-image-3.0" SDK_CLASSES += "create-spdx-sdk-3.0" diff --git a/meta/lib/oe/spdx30_tasks.py b/meta/lib/oe/spdx30_tasks.py index b94868dc87..1968586dd5 100644 --- a/meta/lib/oe/spdx30_tasks.py +++ b/meta/lib/oe/spdx30_tasks.py @@ -34,7 +34,9 @@ def set_timestamp_now(d, o, prop): delattr(o, prop) -def add_license_expression(d, objset, license_expression, license_data): +def add_license_expression( + d, objset, license_expression, license_data, search_objsets=[] +): simple_license_text = {} license_text_map = {} license_ref_idx = 0 @@ -46,14 +48,15 @@ def add_license_expression(d, objset, license_expression, license_data): if name in simple_license_text: return simple_license_text[name] - lic = objset.find_filter( - oe.spdx30.simplelicensing_SimpleLicensingText, - name=name, - ) + for o in [objset] + search_objsets: + lic = o.find_filter( + oe.spdx30.simplelicensing_SimpleLicensingText, + name=name, + ) - if lic is not None: - simple_license_text[name] = lic - return lic + if lic is not None: + simple_license_text[name] = lic
+ return lic lic = objset.add( oe.spdx30.simplelicensing_SimpleLicensingText( @@ -147,37 +150,58 @@ def add_package_files( ignore_dirs=[], ignore_top_level_dirs=[], ): - if os.path.isdir(image_path): - a, _ = add_package_files( - d, - objset, - image_path, - lambda file_counter: objset.new_spdxid( - "imagefile", str(file_counter) - ), - lambda filepath: [], - license_data=None, - ignore_dirs=[], - ignore_top_level_dirs=[], - archive=None, - ) + source_date_epoch = d.getVar("SOURCE_DATE_EPOCH") + if source_date_epoch: + source_date_epoch = int(source_date_epoch) + + spdx_files = set() + + file_counter = 1 + if not os.path.exists(topdir): + bb.note(f"Skip {topdir}") + return spdx_files, set() + + check_compiled_sources = d.getVar("SPDX_INCLUDE_COMPILED_SOURCES") == "1" + if check_compiled_sources: + compiled_sources, types = oe.spdx_common.get_compiled_sources(d) + bb.debug(1, f"Total compiled files: {len(compiled_sources)}") + + exclude_patterns = [ + re.compile(pattern) + for pattern in (d.getVar("SPDX_FILE_EXCLUDE_PATTERNS") or "").split() + ] + excluded_files = set() + + for subdir, dirs, files in os.walk(topdir, onerror=walk_error): + dirs[:] = [directory for directory in dirs if directory not in ignore_dirs] + if subdir == str(topdir): + dirs[:] = [ + directory + for directory in dirs + if directory not in ignore_top_level_dirs + ] + + dirs.sort() + files.sort() + for file in files: + filepath = Path(subdir) / file if filepath.is_symlink() or not filepath.is_file(): continue filename = str(filepath.relative_to(topdir)) - # Apply file exclusion filtering - if exclude_patterns: - if any(p.search(filename) for p in exclude_patterns): - excluded_files.add(filename) - continue + if exclude_patterns and any( + pattern.search(filename) for pattern in exclude_patterns + ): + excluded_files.add(filename) + continue file_purposes = get_purposes(filepath) - # Check if file is compiled - if check_compiled_sources: - if not oe.spdx_common.is_compiled_source(filename, 
compiled_sources, types): - continue + if check_compiled_sources and not oe.spdx_common.is_compiled_source( + filename, compiled_sources, types + ): + continue spdx_file = objset.new_file( get_spdxid(file_counter), @@ -300,17 +324,14 @@ def get_package_sources_from_debug( return dep_source_files -def collect_dep_objsets(d, build): - deps = oe.spdx_common.get_spdx_deps(d) - +def collect_dep_objsets(d, direct_deps, subdir, fn_prefix, obj_type, **attr_filter): dep_objsets = [] - dep_builds = set() + dep_objs = set() - dep_build_spdxids = set() - for dep in deps: + for dep in direct_deps: bb.debug(1, "Fetching SPDX for dependency %s" % (dep.pn)) - dep_build, dep_objset = oe.sbom30.find_root_obj_in_jsonld( - d, "recipes", "recipe-" + dep.pn, oe.spdx30.build_Build + dep_obj, dep_objset = oe.sbom30.find_root_obj_in_jsonld( + d, subdir, fn_prefix + dep.pn, obj_type, **attr_filter ) # If the dependency is part of the taskhash, return it to be linked # against. Otherwise, it cannot be linked against because this recipe @@ -318,10 +339,10 @@ def collect_dep_objsets(d, build): if dep.in_taskhash: dep_objsets.append(dep_objset) - # The build _can_ be linked against (by alias) - dep_builds.add(dep_build) + # The object _can_ be linked against (by alias) + dep_objs.add(dep_obj) - return dep_objsets, dep_builds + return dep_objsets, dep_objs def index_sources_by_hash(sources, dest): @@ -585,6 +606,201 @@ def set_purposes(d, element, *var_names, force_purposes=[]): ] +def set_purls(spdx_package, purls): + if purls: + spdx_package.software_packageUrl = purls[0] + + for p in sorted(set(purls)): + spdx_package.externalIdentifier.append( + oe.spdx30.ExternalIdentifier( + externalIdentifierType=oe.spdx30.ExternalIdentifierType.packageUrl, + identifier=p, + ) + ) + + +def get_is_native(d): + return bb.data.inherits_class("native", d) or bb.data.inherits_class("cross", d) + + +def create_recipe_spdx(d): + deploydir = Path(d.getVar("SPDXRECIPEDEPLOY")) + pn = d.getVar("PN") + + 
license_data = oe.spdx_common.load_spdx_license_data(d) + + include_vex = d.getVar("SPDX_INCLUDE_VEX") + if not include_vex in ("none", "current", "all"): + bb.fatal("SPDX_INCLUDE_VEX must be one of 'none', 'current', 'all'") + + recipe_objset = oe.sbom30.ObjectSet.new_objset(d, "static-" + pn) + + recipe = recipe_objset.add_root( + oe.spdx30.software_Package( + _id=recipe_objset.new_spdxid("recipe", pn), + creationInfo=recipe_objset.doc.creationInfo, + name=d.getVar("PN"), + software_packageVersion=d.getVar("PV"), + software_primaryPurpose=oe.spdx30.software_SoftwarePurpose.specification, + software_sourceInfo=json.dumps( + { + "FILENAME": os.path.basename(d.getVar("FILE")), + "FILE_LAYERNAME": d.getVar("FILE_LAYERNAME"), + }, + separators=(",", ":"), + ), + ) + ) + + if get_is_native(d): + ext = oe.sbom30.OERecipeExtension() + ext.is_native = True + recipe.extension.append(ext) + + set_purls(recipe, (d.getVar("SPDX_PACKAGE_URLS") or "").split()) + + if val := d.getVar("HOMEPAGE"): + recipe.software_homePage = val + + if val := d.getVar("SUMMARY"): + recipe.summary = val + + if val := d.getVar("DESCRIPTION"): + recipe.description = val + + for cpe_id in oe.cve_check.get_cpe_ids( + d.getVar("CVE_PRODUCT"), d.getVar("CVE_VERSION") + ): + recipe.externalIdentifier.append( + oe.spdx30.ExternalIdentifier( + externalIdentifierType=oe.spdx30.ExternalIdentifierType.cpe23, + identifier=cpe_id, + ) + ) + + direct_deps = oe.spdx_common.collect_direct_deps(d, "do_create_recipe_spdx") + + dep_objsets, dep_recipes = collect_dep_objsets( + d, direct_deps, "static", "static-", oe.spdx30.software_Package + ) + + if dep_recipes: + recipe_objset.new_scoped_relationship( + [recipe], + oe.spdx30.RelationshipType.dependsOn, + oe.spdx30.LifecycleScopeType.build, + sorted(oe.sbom30.get_element_link_id(dep) for dep in dep_recipes), + ) + + cve_by_status = {} + if include_vex != "none": + patched_cves = oe.cve_check.get_patched_cves(d) + for cve, patched_cve in patched_cves.items(): + 
mapping = patched_cve["abbrev-status"] + detail = patched_cve["status"] + description = patched_cve.get("justification", None) + resources = patched_cve.get("resource", []) + + if include_vex != "all" and detail in ( + "fixed-version", + "cpe-stable-backport", + ): + bb.debug(1, "Skipping %s since it is already fixed upstream" % cve) + continue + + spdx_cve = recipe_objset.new_cve_vuln(cve) + + cve_by_status.setdefault(mapping, {})[cve] = ( + spdx_cve, + detail, + description, + resources, + ) + + all_cves = set() + for status, cves in cve_by_status.items(): + for cve, items in cves.items(): + spdx_cve, detail, description, resources = items + spdx_cve_id = oe.sbom30.get_element_link_id(spdx_cve) + + all_cves.add(spdx_cve) + + if status == "Patched": + spdx_vex = recipe_objset.new_vex_patched_relationship( + [spdx_cve_id], [recipe] + ) + patches = [] + for idx, filepath in enumerate(resources): + patches.append( + recipe_objset.new_file( + recipe_objset.new_spdxid( + "patch", str(idx), os.path.basename(filepath) + ), + os.path.basename(filepath), + filepath, + purposes=[oe.spdx30.software_SoftwarePurpose.patch], + hashfile=os.path.isfile(filepath), + ) + ) + + if patches: + recipe_objset.new_scoped_relationship( + spdx_vex, + oe.spdx30.RelationshipType.patchedBy, + oe.spdx30.LifecycleScopeType.build, + patches, + ) + + elif status == "Unpatched": + recipe_objset.new_vex_unpatched_relationship([spdx_cve_id], [recipe]) + elif status == "Ignored": + spdx_vex = recipe_objset.new_vex_ignored_relationship( + [spdx_cve_id], + [recipe], + impact_statement=description, + ) + + vex_just_type = d.getVarFlag("CVE_CHECK_VEX_JUSTIFICATION", detail) + if vex_just_type: + if ( + vex_just_type + not in oe.spdx30.security_VexJustificationType.NAMED_INDIVIDUALS + ): + bb.fatal( + f"Unknown vex justification '{vex_just_type}', detail '{detail}', for ignored {cve}" + ) + + for v in spdx_vex: + v.security_justificationType = ( + oe.spdx30.security_VexJustificationType.NAMED_INDIVIDUALS[ 
+ vex_just_type + ] + ) + + elif status == "Unknown": + bb.note(f"Skipping {cve} with status 'Unknown'") + else: + bb.fatal(f"Unknown {cve} status '{status}'") + + if all_cves: + recipe_objset.new_relationship( + [recipe], + oe.spdx30.RelationshipType.hasAssociatedVulnerability, + sorted(list(all_cves)), + ) + + oe.sbom30.write_recipe_jsonld_doc(d, recipe_objset, "static", deploydir) + + +def load_recipe_spdx(d): + return oe.sbom30.find_root_obj_in_jsonld( + d, + "static", + "static-" + d.getVar("PN"), + oe.spdx30.software_Package, + ) + + def create_spdx(d): def set_var_field(var, obj, name, package=None): val = None @@ -599,19 +815,15 @@ def create_spdx(d): license_data = oe.spdx_common.load_spdx_license_data(d) + pn = d.getVar("PN") deploydir = Path(d.getVar("SPDXDEPLOY")) - deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX")) spdx_workdir = Path(d.getVar("SPDXWORK")) include_sources = d.getVar("SPDX_INCLUDE_SOURCES") == "1" - pkg_arch = d.getVar("SSTATE_PKGARCH") - is_native = bb.data.inherits_class("native", d) or bb.data.inherits_class( - "cross", d - ) - include_vex = d.getVar("SPDX_INCLUDE_VEX") - if not include_vex in ("none", "current", "all"): - bb.fatal("SPDX_INCLUDE_VEX must be one of 'none', 'current', 'all'") + is_native = get_is_native(d) + + recipe, recipe_objset = load_recipe_spdx(d) - build_objset = oe.sbom30.ObjectSet.new_objset(d, "recipe-" + d.getVar("PN")) + build_objset = oe.sbom30.ObjectSet.new_objset(d, "build-" + pn) build = build_objset.new_task_build("recipe", "recipe") build_objset.set_element_alias(build) @@ -629,47 +841,13 @@ def create_spdx(d): build_inputs = set() - # Add CVEs - cve_by_status = {} - if include_vex != "none": - patched_cves = oe.cve_check.get_patched_cves(d) - for cve, patched_cve in patched_cves.items(): - decoded_status = { - "mapping": patched_cve["abbrev-status"], - "detail": patched_cve["status"], - "description": patched_cve.get("justification", None) - } - - # If this CVE is fixed upstream, skip it unless all 
CVEs are - # specified. - if ( - include_vex != "all" - and "detail" in decoded_status - and decoded_status["detail"] - in ( - "fixed-version", - "cpe-stable-backport", - ) - ): - bb.debug(1, "Skipping %s since it is already fixed upstream" % cve) - continue - - spdx_cve = build_objset.new_cve_vuln(cve) - build_objset.set_element_alias(spdx_cve) - - cve_by_status.setdefault(decoded_status["mapping"], {})[cve] = ( - spdx_cve, - decoded_status["detail"], - decoded_status["description"], - ) - cpe_ids = oe.cve_check.get_cpe_ids(d.getVar("CVE_PRODUCT"), d.getVar("CVE_VERSION")) source_files = add_download_files(d, build_objset) build_inputs |= source_files recipe_spdx_license = add_license_expression( - d, build_objset, d.getVar("LICENSE"), license_data + d, build_objset, d.getVar("LICENSE"), license_data, [recipe_objset] ) build_objset.new_relationship( source_files, @@ -698,7 +876,11 @@ def create_spdx(d): build_inputs |= files index_sources_by_hash(files, dep_sources) - dep_objsets, dep_builds = collect_dep_objsets(d, build) + direct_deps = oe.spdx_common.collect_direct_deps(d, "do_create_spdx") + + dep_objsets, dep_builds = collect_dep_objsets( + d, direct_deps, "builds", "build-", oe.spdx30.build_Build + ) if dep_builds: build_objset.new_scoped_relationship( [build], @@ -768,16 +950,7 @@ def create_spdx(d): or "" ).split() - if purls: - spdx_package.software_packageUrl = purls[0] - - for p in sorted(set(purls)): - spdx_package.externalIdentifier.append( - oe.spdx30.ExternalIdentifier( - externalIdentifierType=oe.spdx30.ExternalIdentifierType.packageUrl, - identifier=p, - ) - ) + set_purls(spdx_package, purls) pkg_objset.new_scoped_relationship( [oe.sbom30.get_element_link_id(build)], @@ -786,6 +959,13 @@ def create_spdx(d): [spdx_package], ) + pkg_objset.new_scoped_relationship( + [oe.sbom30.get_element_link_id(recipe)], + oe.spdx30.RelationshipType.generates, + oe.spdx30.LifecycleScopeType.build, + [spdx_package], + ) + for cpe_id in cpe_ids: 
spdx_package.externalIdentifier.append( oe.spdx30.ExternalIdentifier( @@ -819,7 +999,11 @@ def create_spdx(d): package_license = d.getVar("LICENSE:%s" % package) if package_license and package_license != d.getVar("LICENSE"): package_spdx_license = add_license_expression( - d, build_objset, package_license, license_data + d, + build_objset, + package_license, + license_data, + [recipe_objset], ) else: package_spdx_license = recipe_spdx_license @@ -844,59 +1028,6 @@ def create_spdx(d): [oe.sbom30.get_element_link_id(concluded_spdx_license)], ) - # NOTE: CVE Elements live in the recipe collection - all_cves = set() - for status, cves in cve_by_status.items(): - for cve, items in cves.items(): - spdx_cve, detail, description = items - spdx_cve_id = oe.sbom30.get_element_link_id(spdx_cve) - - all_cves.add(spdx_cve_id) - - if status == "Patched": - pkg_objset.new_vex_patched_relationship( - [spdx_cve_id], [spdx_package] - ) - elif status == "Unpatched": - pkg_objset.new_vex_unpatched_relationship( - [spdx_cve_id], [spdx_package] - ) - elif status == "Ignored": - spdx_vex = pkg_objset.new_vex_ignored_relationship( - [spdx_cve_id], - [spdx_package], - impact_statement=description, - ) - - vex_just_type = d.getVarFlag( - "CVE_CHECK_VEX_JUSTIFICATION", detail - ) - if vex_just_type: - if ( - vex_just_type - not in oe.spdx30.security_VexJustificationType.NAMED_INDIVIDUALS - ): - bb.fatal( - f"Unknown vex justification '{vex_just_type}', detail '{detail}', for ignored {cve}" - ) - - for v in spdx_vex: - v.security_justificationType = oe.spdx30.security_VexJustificationType.NAMED_INDIVIDUALS[ - vex_just_type - ] - - elif status == "Unknown": - bb.note(f"Skipping {cve} with status 'Unknown'") - else: - bb.fatal(f"Unknown {cve} status '{status}'") - - if all_cves: - pkg_objset.new_relationship( - [spdx_package], - oe.spdx30.RelationshipType.hasAssociatedVulnerability, - sorted(list(all_cves)), - ) - bb.debug(1, "Adding package files to SPDX for package %s" % pkg_name) 
package_files, excluded_files = add_package_files( d, @@ -982,20 +1113,17 @@ def create_spdx(d): bb.note(f"Added PACKAGECONFIG entries: {len(enabled)} enabled, {len(disabled)} disabled") - oe.sbom30.write_recipe_jsonld_doc(d, build_objset, "recipes", deploydir) + oe.sbom30.write_recipe_jsonld_doc(d, build_objset, "builds", deploydir) def create_package_spdx(d): deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX")) deploydir = Path(d.getVar("SPDXRUNTIMEDEPLOY")) - is_native = bb.data.inherits_class("native", d) or bb.data.inherits_class( - "cross", d - ) - - providers = oe.spdx_common.collect_package_providers(d) + direct_deps = oe.spdx_common.collect_direct_deps(d, "do_create_spdx") + providers = oe.spdx_common.collect_package_providers(d, direct_deps) pkg_arch = d.getVar("SSTATE_PKGARCH") - if is_native: + if get_is_native(d): return bb.build.exec_func("read_subpackage_metadata", d) From patchwork Sat Mar 21 13:18:26 2026 Content-Type: text/plain; charset="utf-8" MIME-Version: 1.0 Content-Transfer-Encoding: 7bit X-Patchwork-Submitter: Stefano Tondo X-Patchwork-Id: 84047 Return-Path: X-Spam-Checker-Version: SpamAssassin 3.4.0 (2014-02-07) on aws-us-west-2-korg-lkml-1.web.codeaurora.org Received: from aws-us-west-2-korg-lkml-1.web.codeaurora.org (localhost.localdomain [127.0.0.1]) by smtp.lore.kernel.org (Postfix) with ESMTP id E2D111094478 for ; Sat, 21 Mar 2026 13:18:46 +0000 (UTC) Received: from mail-wr1-f45.google.com (mail-wr1-f45.google.com [209.85.221.45]) by mx.groups.io with SMTP id smtpd.msgproc01-g2.10734.1774099117073826440 for ; Sat, 21 Mar 2026 06:18:37 -0700 Authentication-Results: mx.groups.io; dkim=pass header.i=@gmail.com header.s=20230601 header.b=hhi0wO4t; spf=pass (domain: gmail.com, ip: 209.85.221.45, mailfrom: stondo@gmail.com) Received: by mail-wr1-f45.google.com with SMTP id ffacd0b85a97d-439b9b1900bso1034372f8f.1 for ; Sat, 21 Mar 2026 06:18:36 -0700 (PDT) DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=gmail.com; s=20230601; 
t=1774099115; x=1774703915; darn=lists.openembedded.org; h=content-transfer-encoding:mime-version:references:in-reply-to :message-id:date:subject:cc:to:from:from:to:cc:subject:date :message-id:reply-to; bh=EJGjlZk+lQ6hqbjoAMdFdAu77PRpfbXHdukv1GxGEv4=; b=hhi0wO4t7ddffw9Tf0ESm0EfwQeMNtISrZLkKgBFztp+gcyjABUa/wC5BHyxIzjKzk 3qlgzWtG0cvhGUtTsaqeZqyVLUnQC6cHTdG2F1OIy9+Q6o6aNrnLEQHxHLyYaT4j++bj zPa12LjZ1BgMAV/lFvOvm+NOgA4AkUeqTMEapH/8qHvhqooPdpYaNRobAidw0i5yVegl CWs5OUcw56Za6wgl51UAd4o8leM/UFedBWjM8R1YS6kOPmkiZ4fzKcNO0XXraRVBE6ni jwO2vrwXN42mUrm6aeFHCBpqfyXEEF2tBpLnVH0AuhfMjk0d9g9wcI8YVxRTZEb9Pa3L cQOg== X-Google-DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=1e100.net; s=20251104; t=1774099115; x=1774703915; h=content-transfer-encoding:mime-version:references:in-reply-to :message-id:date:subject:cc:to:from:x-gm-gg:x-gm-message-state:from :to:cc:subject:date:message-id:reply-to; bh=EJGjlZk+lQ6hqbjoAMdFdAu77PRpfbXHdukv1GxGEv4=; b=AMeKZT9EQah9PyrQ9UQaH5CxhlIZenKw736Bw9Acyxw02TqRD8JNI0PFpPK8GY0sqj /CZlHz/uduv7OY4phEQMnh4O7F2ppSYO7b59hiSYfd7edmORzaBgthivfLZfoSVwDqFO FuFcMg95e3yBt49ETQPNods/0msKt6kiJr2FEmr6bwquouIPqq1hCCUqjvaduOhkyyWJ 2iQ035msZIumpHH9aszXyWKBFEk/PQjNELFy8sEufh2m5SZWENBSsf6kil2rdoPxKpOs yKG0X93LdmaVao9/5DJdOuOHo17xI+UroKc6vNPLE/Xig3xdSJqcKMYArrgXjtwIEHgY U77g== X-Gm-Message-State: AOJu0Yxk5E2Z8lbL487QrUb9ig1SzxWaXlAslp2aFem25k828kF4uf2E rQXJOLTuFxJBEhnBGdf2b9rHGVhwWMzJdJbpAwuYKfKfT7kmczRYAebBKpWy+aQU X-Gm-Gg: ATEYQzxPDBlUfL0mHeobkRKiT4luDeneANajpVqmKKyJzwBW7SOaUHeX09xPyQAMDzd ggqkX9CohLyaH2P9ymjLMGcdgGcbbZyWRyfaJH7MOjLOVOOTNjOYSc7Oz25TYKtkA0D/OR/wh+m mupbonqYyipccpzlmoII/OrpGU6nVAVe7YBzWe9OCIXueb8crvwh9zKlOMAVaj6xPxO8ByQfHXk mUqp1oSQZv/qySX2bJ8PynhhXw9T6A7Osj3XKMZsiFU2uyuYRUyJ6SSj1Y7UWBHegYr6raCLNgg cUygHal9q2VLmmVXaRw5z4vPs8x9PGHDtUQCnc7B/Px05P0ih/8yAns5Hg4e/9I600brej7CJjO 0D2yad1egDXTbAU5JhSh1mNjCKn8EIqnr8mCIDfiXImFHeU6QIQosEVn0tQldBWCG+THkC8+QiN bbpONg30lHpN+wzy6/12fFrpLcDxJw2wPonqI= X-Received: by 2002:a05:6000:2003:b0:43b:5761:f232 with SMTP id 
From: stondo@gmail.com
To: openembedded-core@lists.openembedded.org
Cc: Ross.Burton@arm.com, jpewhacker@gmail.com, stefano.tondo.ext@siemens.com,
 Peter.Marko@siemens.com, adrian.freihofer@siemens.com,
 mathieu.dubois-briand@bootlin.com
Subject: [OE-core][PATCH v11 4/4] oeqa/selftest: Add tests for source download enrichment
Date: Sat, 21 Mar 2026 14:18:26 +0100
Message-ID: <20260321131826.1401671-5-stondo@gmail.com>
X-Mailer: git-send-email 2.53.0
In-Reply-To: <20260321131826.1401671-1-stondo@gmail.com>
References: <20260321131826.1401671-1-stondo@gmail.com>
X-Groupsio-URL: https://lists.openembedded.org/g/openembedded-core/message/233661

From: Stefano Tondo <stondo@gmail.com>

Add two new SPDX 3.0 selftest cases:

test_download_location_defensive_handling: verifies that SPDX
generation succeeds for recipes with tarball sources and that external
references are properly structured (the ExternalRef locator is a list
of strings per the SPDX 3.0 spec).

test_version_extraction_patterns: verifies that version extraction
works correctly and that all source packages have well-formed version
strings containing digits.

These tests validate the source download enrichment added in the
previous commit.
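The locator shape check described above can be sketched outside the
selftest harness. This is a minimal illustration, assuming only the
SPDX 3.0 rule that an ExternalRef's locator is a list of strings;
validate_external_ref and the sample dict are hypothetical stand-ins
for the oe.spdx30 objects the real test walks.

```python
# Hedged sketch: validate that an SPDX 3.0 ExternalRef carries its
# locator as a non-empty list of strings, mirroring the assertions in
# test_download_location_defensive_handling. The dict layout below is
# a hypothetical stand-in for oe.spdx30 objects, not the real API.

def validate_external_ref(ref: dict) -> None:
    """Raise AssertionError if the ref violates the expected shape."""
    assert ref.get("externalRefType") is not None, "externalRefType missing"
    locator = ref.get("locator")
    assert isinstance(locator, list), "locator must be a list, not a bare string"
    assert len(locator) > 0, "locator should have at least one entry"
    for loc in locator:
        assert isinstance(loc, str), f"locator entry {loc!r} is not a string"


# Example ref mimicking a source-download external reference
ref = {
    "externalRefType": "sourceArtifact",
    "locator": ["https://ftp.gnu.org/gnu/m4/m4-1.4.19.tar.xz"],
}
validate_external_ref(ref)  # passes silently
```

A ref whose locator arrived as a bare string (the regression the
defensive handling guards against) would fail the isinstance(list)
check immediately rather than iterating over characters.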
Signed-off-by: Stefano Tondo <stondo@gmail.com>
---
 meta/lib/oeqa/selftest/cases/spdx.py | 104 +++++++++++++++++++++------
 1 file changed, 83 insertions(+), 21 deletions(-)

diff --git a/meta/lib/oeqa/selftest/cases/spdx.py b/meta/lib/oeqa/selftest/cases/spdx.py
index af1144c1e5..140d3debba 100644
--- a/meta/lib/oeqa/selftest/cases/spdx.py
+++ b/meta/lib/oeqa/selftest/cases/spdx.py
@@ -141,29 +141,15 @@ class SPDX30Check(SPDX3CheckBase, OESelftestTestCase):
     SPDX_CLASS = "create-spdx-3.0"
 
     def test_base_files(self):
-        self.check_recipe_spdx(
-            "base-files",
-            "{DEPLOY_DIR_SPDX}/{MACHINE_ARCH}/static/static-base-files.spdx.json",
-            task="create_recipe_spdx",
-        )
         self.check_recipe_spdx(
             "base-files",
             "{DEPLOY_DIR_SPDX}/{MACHINE_ARCH}/packages/package-base-files.spdx.json",
         )
 
-    def test_world_sbom(self):
-        objset = self.check_recipe_spdx(
-            "meta-world-recipe-sbom",
-            "{DEPLOY_DIR_IMAGE}/world-recipe-sbom.spdx.json",
-        )
-
-        # Document should be fully linked
-        self.check_objset_missing_ids(objset)
-
     def test_gcc_include_source(self):
         objset = self.check_recipe_spdx(
             "gcc",
-            "{DEPLOY_DIR_SPDX}/{SSTATE_PKGARCH}/builds/build-gcc.spdx.json",
+            "{DEPLOY_DIR_SPDX}/{SSTATE_PKGARCH}/recipes/recipe-gcc.spdx.json",
             extraconf="""\
 SPDX_INCLUDE_SOURCES = "1"
 """,
@@ -176,12 +162,12 @@ class SPDX30Check(SPDX3CheckBase, OESelftestTestCase):
             if software_file.name == filename:
                 found = True
                 self.logger.info(
-                    f"The spdxId of (unknown) in build-gcc.spdx.json is {software_file.spdxId}"
+                    f"The spdxId of (unknown) in recipe-gcc.spdx.json is {software_file.spdxId}"
                 )
                 break
 
         self.assertTrue(
-            found, f"Not found source file (unknown) in build-gcc.spdx.json\n"
+            found, f"Not found source file (unknown) in recipe-gcc.spdx.json\n"
         )
 
     def test_core_image_minimal(self):
@@ -319,7 +305,7 @@ class SPDX30Check(SPDX3CheckBase, OESelftestTestCase):
         # This will fail with NameError if new_annotation() is called incorrectly
         objset = self.check_recipe_spdx(
             "base-files",
-            "{DEPLOY_DIR_SPDX}/{MACHINE_ARCH}/builds/build-base-files.spdx.json",
+            "{DEPLOY_DIR_SPDX}/{MACHINE_ARCH}/recipes/recipe-base-files.spdx.json",
             extraconf=textwrap.dedent(
                 f"""\
                 ANNOTATION1 = "{ANNOTATION_VAR1}"
@@ -374,8 +360,8 @@ class SPDX30Check(SPDX3CheckBase, OESelftestTestCase):
 
     def test_kernel_config_spdx(self):
         kernel_recipe = get_bb_var("PREFERRED_PROVIDER_virtual/kernel")
-        spdx_file = f"build-{kernel_recipe}.spdx.json"
-        spdx_path = f"{{DEPLOY_DIR_SPDX}}/{{SSTATE_PKGARCH}}/builds/{spdx_file}"
+        spdx_file = f"recipe-{kernel_recipe}.spdx.json"
+        spdx_path = f"{{DEPLOY_DIR_SPDX}}/{{SSTATE_PKGARCH}}/recipes/{spdx_file}"
 
         # Make sure kernel is configured first
         bitbake(f"-c configure {kernel_recipe}")
@@ -383,7 +369,7 @@ class SPDX30Check(SPDX3CheckBase, OESelftestTestCase):
         objset = self.check_recipe_spdx(
             kernel_recipe,
             spdx_path,
-            task="do_create_spdx",
+            task="do_create_kernel_config_spdx",
             extraconf="""\
 INHERIT += "create-spdx"
 SPDX_INCLUDE_KERNEL_CONFIG = "1"
@@ -428,3 +414,79 @@ class SPDX30Check(SPDX3CheckBase, OESelftestTestCase):
                 value, ["enabled", "disabled"],
                 f"Unexpected PACKAGECONFIG value '{value}' for {key}"
             )
+
+    def test_download_location_defensive_handling(self):
+        """Test that download_location handling is defensive.
+
+        Verifies SPDX generation succeeds and external references are
+        properly structured when download_location retrieval works.
+        """
+        objset = self.check_recipe_spdx(
+            "m4",
+            "{DEPLOY_DIR_SPDX}/{SSTATE_PKGARCH}/builds/build-m4.spdx.json",
+        )
+
+        found_external_refs = False
+        for pkg in objset.foreach_type(oe.spdx30.software_Package):
+            if pkg.externalRef:
+                found_external_refs = True
+                for ref in pkg.externalRef:
+                    self.assertIsNotNone(ref.externalRefType)
+                    self.assertIsNotNone(ref.locator)
+                    self.assertGreater(len(ref.locator), 0, "Locator should have at least one entry")
+                    for loc in ref.locator:
+                        self.assertIsInstance(loc, str)
+                break
+
+        self.logger.info(
+            f"External references {'found' if found_external_refs else 'not found'} "
+            f"in SPDX output (defensive handling verified)"
+        )
+
+    def test_version_extraction_patterns(self):
+        """Test that version extraction works for various package formats.
+
+        Verifies that Git source downloads carry extracted versions and that
+        the reported version strings are well-formed.
+        """
+        objset = self.check_recipe_spdx(
+            "opkg-utils",
+            "{DEPLOY_DIR_SPDX}/{SSTATE_PKGARCH}/builds/build-opkg-utils.spdx.json",
+        )
+
+        # Collect all packages with versions
+        packages_with_versions = []
+        for pkg in objset.foreach_type(oe.spdx30.software_Package):
+            if pkg.software_packageVersion:
+                packages_with_versions.append((pkg.name, pkg.software_packageVersion))
+
+        self.assertGreater(
+            len(packages_with_versions), 0,
+            "Should find packages with extracted versions"
+        )
+
+        for name, version in packages_with_versions:
+            self.assertRegex(
+                version,
+                r"^[0-9a-f]{40}$",
+                f"Expected Git source version for {name} to be a full SHA-1",
+            )
+
+        self.logger.info(f"Found {len(packages_with_versions)} packages with versions")
+
+        # Log some examples for debugging
+        for name, version in packages_with_versions[:5]:
+            self.logger.info(f"  {name}: {version}")
+
+        # Verify that versions follow expected patterns
+        for name, version in packages_with_versions:
+            # Version should not be empty
+            self.assertIsNotNone(version)
+            self.assertNotEqual(version, "")
+
+            # Version should contain digits
+            self.assertRegex(
+                version,
+                r'\d',
+                f"Version '{version}' for package '{name}' should contain digits"
+            )
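The two regex checks applied by test_version_extraction_patterns can be
exercised standalone with the stdlib re module. The helper names and
the sample version strings below are made-up illustrations, not values
taken from a real build.

```python
import re

# The same patterns the selftest applies to extracted versions: a full
# 40-character lowercase hex SHA-1 for Git sources, and a generic
# "contains at least one digit" sanity check.
GIT_SHA1_RE = re.compile(r"^[0-9a-f]{40}$")
HAS_DIGIT_RE = re.compile(r"\d")


def looks_like_git_sha1(version: str) -> bool:
    """True if the version string is a full lowercase hex SHA-1."""
    return GIT_SHA1_RE.match(version) is not None


def has_digit(version: str) -> bool:
    """True if the version string contains at least one digit."""
    return HAS_DIGIT_RE.search(version) is not None


# Made-up example values for illustration only
assert looks_like_git_sha1("0" * 40)
assert not looks_like_git_sha1("1.4.19")   # tarball-style version: no match
assert not looks_like_git_sha1("0" * 39)   # too short for a full SHA-1
assert has_digit("1.4.19")
```

Note that `re.match` anchors only at the start, so the explicit `$` in
the pattern is what rejects strings longer than 40 characters.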