From patchwork Fri Jul 12 15:58:11 2024
X-Patchwork-Submitter: Joshua Watt
X-Patchwork-Id: 46263
From: Joshua Watt
To: openembedded-core@lists.openembedded.org
Cc: Joshua Watt
Subject: [OE-core][PATCH v6 01/12] classes-recipe/image: Add image file manifest
Date: Fri, 12 Jul 2024 09:58:11 -0600
Message-ID: <20240712160304.3514496-2-JPEWhacker@gmail.com>
In-Reply-To: <20240712160304.3514496-1-JPEWhacker@gmail.com>
References: <20240703140059.4096394-1-JPEWhacker@gmail.com> <20240712160304.3514496-1-JPEWhacker@gmail.com>
List-Id: X-Webhook-Received: from li982-79.members.linode.com [45.33.32.79] by aws-us-west-2-korg-lkml-1.web.codeaurora.org with HTTPS for ; Fri, 12 Jul 2024 16:03:12 -0000 X-Groupsio-URL: https://lists.openembedded.org/g/openembedded-core/message/201835 Downstream tasks may want to know what image files were written by the do_image family of tasks (e.g. SPDX) so have each task write out a manifest file that describes the files it produced, then aggregate them in do_image_complete Signed-off-by: Joshua Watt --- meta/classes-recipe/image.bbclass | 58 +++++++++++++++++++++++++++++++ 1 file changed, 58 insertions(+) diff --git a/meta/classes-recipe/image.bbclass b/meta/classes-recipe/image.bbclass index 28be6c63623..32bafcdbbc3 100644 --- a/meta/classes-recipe/image.bbclass +++ b/meta/classes-recipe/image.bbclass @@ -88,6 +88,11 @@ PACKAGE_INSTALL_ATTEMPTONLY ?= "${FEATURE_INSTALL_OPTIONAL}" IMGDEPLOYDIR = "${WORKDIR}/deploy-${PN}-image-complete" +IMGMANIFESTDIR = "${WORKDIR}/image-task-manifest" + +IMAGE_OUTPUT_MANIFEST_DIR = "${WORKDIR}/deploy-image-output-manifest" +IMAGE_OUTPUT_MANIFEST = "${IMAGE_OUTPUT_MANIFEST_DIR}/manifest.json" + # Images are generally built explicitly, do not need to be part of world. EXCLUDE_FROM_WORLD = "1" @@ -277,14 +282,28 @@ fakeroot python do_image () { execute_pre_post_process(d, pre_process_cmds) } do_image[dirs] = "${TOPDIR}" +do_image[cleandirs] += "${IMGMANIFESTDIR}" addtask do_image after do_rootfs fakeroot python do_image_complete () { from oe.utils import execute_pre_post_process + from pathlib import Path + import json post_process_cmds = d.getVar("IMAGE_POSTPROCESS_COMMAND") execute_pre_post_process(d, post_process_cmds) + + image_manifest_dir = Path(d.getVar('IMGMANIFESTDIR')) + + data = [] + + for manifest_path in image_manifest_dir.glob("*.json"): + with manifest_path.open("r") as f: + data.extend(json.load(f)) + + with open(d.getVar("IMAGE_OUTPUT_MANIFEST"), "w") as f: + json.dump(data, f) } do_image_complete[dirs] = "${TOPDIR}" SSTATETASKS += "do_image_complete" @@ -292,6 +311,8 @@ SSTATE_SKIP_CREATION:task-image-complete = '1' do_image_complete[sstate-inputdirs] = "${IMGDEPLOYDIR}" do_image_complete[sstate-outputdirs] = "${DEPLOY_DIR_IMAGE}" do_image_complete[stamp-extra-info] = "${MACHINE_ARCH}" +do_image_complete[sstate-plaindirs] += "${IMAGE_OUTPUT_MANIFEST_DIR}" +do_image_complete[dirs] += "${IMAGE_OUTPUT_MANIFEST_DIR}" addtask do_image_complete after do_image before do_build python do_image_complete_setscene () { sstate_setscene(d) @@ -507,12 +528,14 @@ python () { d.setVar(task, '\n'.join(cmds)) d.setVarFlag(task, 'func', '1') d.setVarFlag(task, 'fakeroot', '1') + d.setVarFlag(task, 'imagetype', t) d.appendVarFlag(task, 'prefuncs', ' ' + debug + ' set_image_size') d.prependVarFlag(task, 'postfuncs', 'create_symlinks ') d.appendVarFlag(task, 'subimages', ' ' + ' '.join(subimages)) d.appendVarFlag(task, 'vardeps', ' ' + ' '.join(vardeps)) d.appendVarFlag(task, 'vardepsexclude', ' DATETIME DATE ' + ' '.join(vardepsexclude)) + d.appendVarFlag(task, 'postfuncs', ' write_image_output_manifest') bb.debug(2, "Adding task %s before %s, after %s" % (task, 'do_image_complete', after)) bb.build.addtask(task, 'do_image_complete', after, d) @@ -610,6 +633,41 @@ python create_symlinks() { bb.note("Skipping symlink, source does not exist: %s -> %s" % (dst, src)) } +python write_image_output_manifest() { + import json + from pathlib import Path + + taskname = d.getVar("BB_CURRENTTASK") + image_deploy_dir = Path(d.getVar('IMGDEPLOYDIR')) + 
image_manifest_dir = Path(d.getVar('IMGMANIFESTDIR')) + manifest_path = image_manifest_dir / ("do_" + d.getVar("BB_CURRENTTASK") + ".json") + + image_name = d.getVar("IMAGE_NAME") + image_basename = d.getVar("IMAGE_BASENAME") + machine = d.getVar("MACHINE") + + subimages = (d.getVarFlag("do_" + taskname, 'subimages', False) or "").split() + imagetype = d.getVarFlag("do_" + taskname, 'imagetype', False) + + data = { + "taskname": taskname, + "imagetype": imagetype, + "images": [] + } + + for type in subimages: + image_filename = image_name + "." + type + image_path = image_deploy_dir / image_filename + if not image_path.exists(): + continue + data["images"].append({ + "filename": image_filename, + }) + + with manifest_path.open("w") as f: + json.dump([data], f) +} + MULTILIBRE_ALLOW_REP += "${base_bindir} ${base_sbindir} ${bindir} ${sbindir} ${libexecdir} ${sysconfdir} ${nonarch_base_libdir}/udev /lib/modules/[^/]*/modules.*" MULTILIB_CHECK_FILE = "${WORKDIR}/multilib_check.py" MULTILIB_TEMP_ROOTFS = "${WORKDIR}/multilib" From patchwork Fri Jul 12 15:58:12 2024 Content-Type: text/plain; charset="utf-8" MIME-Version: 1.0 Content-Transfer-Encoding: 7bit X-Patchwork-Submitter: Joshua Watt X-Patchwork-Id: 46262 Return-Path: X-Spam-Checker-Version: SpamAssassin 3.4.0 (2014-02-07) on aws-us-west-2-korg-lkml-1.web.codeaurora.org Received: from aws-us-west-2-korg-lkml-1.web.codeaurora.org (localhost.localdomain [127.0.0.1]) by smtp.lore.kernel.org (Postfix) with ESMTP id 53424C3DA4D for ; Fri, 12 Jul 2024 16:03:13 +0000 (UTC) Received: from mail-ot1-f52.google.com (mail-ot1-f52.google.com [209.85.210.52]) by mx.groups.io with SMTP id smtpd.web11.11588.1720800191783117006 for ; Fri, 12 Jul 2024 09:03:11 -0700 Authentication-Results: mx.groups.io; dkim=pass header.i=@gmail.com header.s=20230601 header.b=GrXAtzbR; spf=pass (domain: gmail.com, ip: 209.85.210.52, mailfrom: jpewhacker@gmail.com) Received: by mail-ot1-f52.google.com with SMTP id 46e09a7af769-703d5b29e06so1039764a34.2 for ; Fri, 12 Jul 2024 09:03:11 -0700 (PDT) DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=gmail.com; s=20230601; t=1720800190; x=1721404990; darn=lists.openembedded.org; h=content-transfer-encoding:mime-version:references:in-reply-to :message-id:date:subject:cc:to:from:from:to:cc:subject:date :message-id:reply-to; bh=XAPW56oVcSidcfVF6Ske72luKN0hEzBf10jZJ7PBYUQ=; b=GrXAtzbR2eGmr71pdv5xzVAoGITDwW/T7D63/7R+z/PQZPdIZiSA6NncpLEDCWw7WQ WAIdXZLbi52GzdGHSdKOFAbJT98BRjDgYJ3koRHmbOjqVsgS++pgGtk/ntEzDKTAV9ox MzFTVG/bY64VbTB/9NmD+neVN/owe6EK9ed3wHc/rvHQAZz5fEh6V3gKmQveWYMg3kR/ 0Fqpj9N0r2IdTi9nUAcwu+uNmtfnyV0tRjEMc4Fx55MnLw1szcop3+NQzQeg6v8ynGBn KWXbk1is+m/L/3Be9wjsATXpQ23nCtUc4ixYr6EKZqEhbOjTn+Sj3MZ0mXJncuaK2c5s nmXw== X-Google-DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=1e100.net; s=20230601; t=1720800190; x=1721404990; h=content-transfer-encoding:mime-version:references:in-reply-to :message-id:date:subject:cc:to:from:x-gm-message-state:from:to:cc :subject:date:message-id:reply-to; bh=XAPW56oVcSidcfVF6Ske72luKN0hEzBf10jZJ7PBYUQ=; b=HgUfXx2prWKYxumiDlsfNnAXv6wsYHuPj+8II0rQhhq9+fN33RegssHoWfBNb5AJyP VDqIRQT9u7bmFarm2TVeOYJReJA3K/+73eepygjbx0lczR1eDkGaN/LEx4cx4hElTQpw gZZcEBTXjnzu9GtLyTViNeHmDkdqVM4bxmNhqxRX9uSOIV7ctT84NMZ0L/oy1w7z18lc X/0kK7lsdWHyx6L3wfq+bV/5VtHioTyMMKHHEbx5bbTW2e/O4ASFtoIS4JhaEWj19FMH 3irroQ6rguUPjb3xrU37rJXmpIG42JqfMPghIQ2Jg3CE+3QHpisONlX+nJMsOMJDDV8S 0Gzw== X-Gm-Message-State: AOJu0YxLap3xnY8CNNhIJEpX9fpNG+JZp1lXGP/Z88TM4061U3W/Zr3X 8yJXs0e4ku+BbLTDUMnZBtda1cVTjeeZQIBEEVLgwe3OvZtzwmviUrCO7A== 
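For reference, the aggregated file written to ${IMAGE_OUTPUT_MANIFEST} by the two patches above is a JSON list with one entry per do_image_* task; the task and file names below are purely illustrative:

[
    {
        "taskname": "do_image_ext4",
        "imagetype": "ext4",
        "images": [
            { "filename": "example-image-qemux86-64.rootfs.ext4" }
        ]
    }
]

A downstream task could consume it along these lines (a minimal sketch, not part of the series; the function name is invented):

python read_image_output_manifest() {
    import json

    # IMAGE_OUTPUT_MANIFEST points at the manifest.json aggregated by do_image_complete
    with open(d.getVar("IMAGE_OUTPUT_MANIFEST")) as f:
        for task in json.load(f):
            for image in task["images"]:
                bb.note("%s (%s) produced %s" % (task["taskname"], task["imagetype"], image["filename"]))
}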
X-Google-Smtp-Source: AGHT+IHqnZalOsTIhXKTC/TGy94mD95woPaLtJFJAbc6F7UrEoL01gQs/aauEC2Ssy5hin4PLO6lyg== X-Received: by 2002:a05:6871:b23:b0:25e:2b26:e05e with SMTP id 586e51a60fabf-25eae83c5demr8743914fac.23.1720800190048; Fri, 12 Jul 2024 09:03:10 -0700 (PDT) Received: from localhost.localdomain ([2601:282:4300:19e0::4a71]) by smtp.gmail.com with ESMTPSA id 586e51a60fabf-25eaa29d16dsm2267694fac.53.2024.07.12.09.03.09 (version=TLS1_3 cipher=TLS_AES_256_GCM_SHA384 bits=256/256); Fri, 12 Jul 2024 09:03:09 -0700 (PDT) From: Joshua Watt X-Google-Original-From: Joshua Watt To: openembedded-core@lists.openembedded.org Cc: Joshua Watt Subject: [OE-core][PATCH v6 02/12] classes-recipe/baremetal-image: Add image file manifest Date: Fri, 12 Jul 2024 09:58:12 -0600 Message-ID: <20240712160304.3514496-3-JPEWhacker@gmail.com> X-Mailer: git-send-email 2.45.2 In-Reply-To: <20240712160304.3514496-1-JPEWhacker@gmail.com> References: <20240703140059.4096394-1-JPEWhacker@gmail.com> <20240712160304.3514496-1-JPEWhacker@gmail.com> MIME-Version: 1.0 List-Id: X-Webhook-Received: from li982-79.members.linode.com [45.33.32.79] by aws-us-west-2-korg-lkml-1.web.codeaurora.org with HTTPS for ; Fri, 12 Jul 2024 16:03:13 -0000 X-Groupsio-URL: https://lists.openembedded.org/g/openembedded-core/message/201836 Downstream tasks may want to know what image files were written so write out a manifest in do_image_complete. The format of the manifest is the same as the one in image.bbclass Signed-off-by: Joshua Watt --- meta/classes-recipe/baremetal-image.bbclass | 32 +++++++++++++++++++-- 1 file changed, 29 insertions(+), 3 deletions(-) diff --git a/meta/classes-recipe/baremetal-image.bbclass b/meta/classes-recipe/baremetal-image.bbclass index 4e7d413626e..7938c0a83f1 100644 --- a/meta/classes-recipe/baremetal-image.bbclass +++ b/meta/classes-recipe/baremetal-image.bbclass @@ -30,6 +30,9 @@ BAREMETAL_BINNAME ?= "hello_baremetal_${MACHINE}" IMAGE_LINK_NAME ?= "baremetal-helloworld-image-${MACHINE}" IMAGE_NAME_SUFFIX ?= "" +IMAGE_OUTPUT_MANIFEST_DIR = "${WORKDIR}/deploy-image-output-manifest" +IMAGE_OUTPUT_MANIFEST = "${IMAGE_OUTPUT_MANIFEST_DIR}/manifest.json" + do_rootfs[dirs] = "${IMGDEPLOYDIR} ${DEPLOY_DIR_IMAGE}" do_image(){ @@ -37,8 +40,28 @@ do_image(){ install ${D}/${base_libdir}/firmware/${BAREMETAL_BINNAME}.elf ${IMGDEPLOYDIR}/${IMAGE_LINK_NAME}.elf } -do_image_complete(){ - : +python do_image_complete(){ + from pathlib import Path + import json + + data = { + "taskname": "do_image", + "imagetype": "baremetal-image", + "images": [] + } + + img_deploy_dir = Path(d.getVar("IMGDEPLOYDIR")) + + for child in img_deploy_dir.iterdir(): + if not child.is_file() or child.is_symlink(): + continue + + data["images"].append({ + "filename": child.name, + }) + + with open(d.getVar("IMAGE_OUTPUT_MANIFEST"), "w") as f: + json.dump([data], f) } python do_rootfs(){ @@ -62,6 +85,7 @@ python do_rootfs(){ bb.utils.mkdirhier(sysconfdir) execute_pre_post_process(d, d.getVar('ROOTFS_POSTPROCESS_COMMAND')) + execute_pre_post_process(d, d.getVar("ROOTFS_POSTUNINSTALL_COMMAND")) } @@ -72,6 +96,8 @@ SSTATE_SKIP_CREATION:task-image-complete = '1' do_image_complete[sstate-inputdirs] = "${IMGDEPLOYDIR}" do_image_complete[sstate-outputdirs] = "${DEPLOY_DIR_IMAGE}" do_image_complete[stamp-extra-info] = "${MACHINE_ARCH}" +do_image_complete[sstate-plaindirs] += "${IMAGE_OUTPUT_MANIFEST_DIR}" +do_image_complete[dirs] += "${IMAGE_OUTPUT_MANIFEST_DIR}" addtask do_image_complete after do_image before do_build python do_image_complete_setscene () { @@ -140,5 
+166,5 @@ python(){
             else:
                 deps += " %s:%s" % (dep, task)
         return deps
-    d.appendVarFlag('do_image', 'depends', extraimage_getdepends('do_populate_sysroot'))
+    d.appendVarFlag('do_image', 'depends', extraimage_getdepends('do_populate_sysroot'))
 }

From patchwork Fri Jul 12 15:58:13 2024
X-Patchwork-Submitter: Joshua Watt
X-Patchwork-Id: 46273
From: Joshua Watt
To: openembedded-core@lists.openembedded.org
Cc: Joshua Watt
Subject: [OE-core][PATCH v6 03/12] classes/create-spdx-3.0: Add classes
Date: Fri, 12 Jul 2024 09:58:13 -0600
Message-ID: 
<20240712160304.3514496-4-JPEWhacker@gmail.com> X-Mailer: git-send-email 2.45.2 In-Reply-To: <20240712160304.3514496-1-JPEWhacker@gmail.com> References: <20240703140059.4096394-1-JPEWhacker@gmail.com> <20240712160304.3514496-1-JPEWhacker@gmail.com> MIME-Version: 1.0 List-Id: X-Webhook-Received: from li982-79.members.linode.com [45.33.32.79] by aws-us-west-2-korg-lkml-1.web.codeaurora.org with HTTPS for ; Fri, 12 Jul 2024 16:03:23 -0000 X-Groupsio-URL: https://lists.openembedded.org/g/openembedded-core/message/201837 Adds a class to generate SPDX 3.0 output and an image class that is used when generating images Signed-off-by: Joshua Watt --- meta/classes-recipe/packagegroup.bbclass | 2 + meta/classes/create-spdx-3.0.bbclass | 1043 ++++ meta/classes/create-spdx-image-3.0.bbclass | 415 ++ meta/classes/spdx-common.bbclass | 6 +- meta/lib/oe/sbom30.py | 1138 ++++ meta/lib/oe/spdx30.py | 6020 ++++++++++++++++++++ 6 files changed, 8623 insertions(+), 1 deletion(-) create mode 100644 meta/classes/create-spdx-3.0.bbclass create mode 100644 meta/classes/create-spdx-image-3.0.bbclass create mode 100644 meta/lib/oe/sbom30.py create mode 100644 meta/lib/oe/spdx30.py diff --git a/meta/classes-recipe/packagegroup.bbclass b/meta/classes-recipe/packagegroup.bbclass index cf6fc354a81..2099c8b37d6 100644 --- a/meta/classes-recipe/packagegroup.bbclass +++ b/meta/classes-recipe/packagegroup.bbclass @@ -56,6 +56,8 @@ deltask do_populate_sysroot do_create_runtime_spdx[deptask] = "do_create_spdx" do_create_runtime_spdx[rdeptask] = "" +do_create_package_spdx[deptask] = "do_create_spdx" +do_create_package_spdx[rdeptask] = "" INHIBIT_DEFAULT_DEPS = "1" diff --git a/meta/classes/create-spdx-3.0.bbclass b/meta/classes/create-spdx-3.0.bbclass new file mode 100644 index 00000000000..51168e4876c --- /dev/null +++ b/meta/classes/create-spdx-3.0.bbclass @@ -0,0 +1,1043 @@ +# +# Copyright OpenEmbedded Contributors +# +# SPDX-License-Identifier: GPL-2.0-only +# + +inherit spdx-common + +SPDX_VERSION = "3.0.0" + +# The list of SPDX profiles generated documents will conform to +SPDX_PROFILES ?= "core build software simpleLicensing security" + +SPDX_INCLUDE_BUILD_VARIABLES ??= "0" +SPDX_INCLUDE_BUILD_VARIABLES[doc] = "If set to '1', the bitbake variables for a \ + recipe will be included in the Build object. This will most likely result \ + in non-reproducible SPDX output" + +SPDX_INCLUDE_BITBAKE_PARENT_BUILD ??= "0" +SPDX_INCLUDE_BITBAKE_PARENT_BUILD[doc] = "Report the parent invocation of bitbake \ + for each Build object. This allows you to know who invoked bitbake to perform \ + a build, but will result in non-reproducible SPDX output." + +SPDX_PACKAGE_ADDITIONAL_PURPOSE ?= "" +SPDX_PACKAGE_ADDITIONAL_PURPOSE[doc] = "The list of additional purposes to assign to \ + the generated packages for a recipe. The primary purpose is always `install`. \ + Packages overrides are allowed to override the additional purposes for \ + individual packages." + +SPDX_IMAGE_PURPOSE ?= "filesystemImage" +SPDX_IMAGE_PURPOSE[doc] = "The list of purposes to assign to the generated images. \ + The first listed item will be the Primary Purpose and all additional items will \ + be added as additional purposes" + +SPDX_SDK_PURPOSE ?= "install" +SPDX_SDK_PURPOSE[doc] = "The list of purposes to assign to the generate SDK installer. 
\ + The first listed item will be the Primary Purpose and all additional items will \ + be added as additional purposes" + +SPDX_INCLUDE_VEX ??= "current" +SPDX_INCLUDE_VEX[doc] = "Controls what VEX information is in the output. Set to \ + 'none' to disable all VEX data. Set to 'current' to only include VEX data \ + for vulnerabilities not already fixed in the upstream source code \ + (recommended). Set to 'all' to get all known historical vulnerabilities, \ + including those already fixed upstream (warning: This can be large and \ + slow)." + +SPDX_INCLUDE_TIMESTAMPS ?= "0" +SPDX_INCLUDE_TIMESTAMPS[doc] = "Include time stamps in SPDX output. This is \ + useful if you want to know when artifacts were produced and when builds \ + occurred, but will result in non-reproducible SPDX output" + +SPDX_IMPORTS ??= "" +SPDX_IMPORTS[doc] = "SPDX_IMPORTS is the base variable that describes how to \ + reference external SPDX ids. Each import is defined as a key in this \ + variable with a suffix to describe to as a suffix to look up more \ + information about the import. Each key can have the following variables: \ + SPDX_IMPORTS__spdxid: The Fully qualified SPDX ID of the object \ + SPDX_IMPORTS__uri: The URI where the SPDX Document that contains \ + the external object can be found. Optional but recommended \ + SPDX_IMPORTS__hash_: The Checksum of the SPDX Document that \ + contains the External ID. must be one the valid SPDX hashing \ + algorithms, as described by the HashAlgorithm vocabulary in the\ + SPDX 3 spec. Optional but recommended" + +# Agents +# Bitbake variables can be used to describe an SPDX Agent that may be used +# during the build. An Agent is specified using a set of variables which all +# start with some common base name: +# +# _name: The name of the Agent (required) +# _type: The type of Agent. Must be one of "person", "organization", +# "software", or "agent" (the default if not specified) +# _comment: The comment for the Agent (optional) +# _id_: And External Identifier for the Agent. must be a valid +# ExternalIdentifierType from the SPDX 3 spec. Commonly, an E-mail address +# can be specified with _id_email +# +# Alternatively, an Agent can be an external reference by referencing a key +# in SPDX_IMPORTS like so: +# +# _import = "" +# +# Finally, the same agent described by another set of agent variables can be +# referenced by specifying the basename of the variable that should be +# referenced: +# +# SPDX_PACKAGE_SUPPLIER_ref = "SPDX_AUTHORS_openembedded" + +SPDX_AUTHORS ??= "openembedded" +SPDX_AUTHORS[doc] = "A space separated list of the document authors. Each item \ + is used to name a base variable like SPDX_AUTHORS_ that \ + describes the author." + +SPDX_AUTHORS_openembedded_name = "OpenEmbedded" +SPDX_AUTHORS_openembedded_type = "organization" + +SPDX_BUILD_HOST[doc] = "The base variable name to describe the build host on \ + which a build is running. Must be an SPDX_IMPORTS key. Requires \ + SPDX_INCLUDE_BITBAKE_PARENT_BUILD. NOTE: Setting this will result in \ + non-reproducible SPDX output" + +SPDX_INVOKED_BY[doc] = "The base variable name to describe the Agent that \ + invoked the build, which builds will link to if specified. Requires \ + SPDX_INCLUDE_BITBAKE_PARENT_BUILD. NOTE: Setting this will likely result in \ + non-reproducible SPDX output" + +SPDX_ON_BEHALF_OF[doc] = "The base variable name to describe the Agent on who's \ + behalf the invoking Agent (SPDX_INVOKED_BY) is running the build. Requires \ + SPDX_INCLUDE_BITBAKE_PARENT_BUILD. 
NOTE: Setting this will likely result in \ + non-reproducible SPDX output" + +SPDX_PACKAGE_SUPPLIER[doc] = "The base variable name to describe the Agent who \ + is supplying artifacts produced by the build" + + +IMAGE_CLASSES:append = " create-spdx-image-3.0" + +def set_timestamp_now(d, o, prop): + from datetime import datetime, timezone + + if d.getVar("SPDX_INCLUDE_TIMESTAMPS") == "1": + setattr(o, prop, datetime.now(timezone.utc)) + else: + # Doing this helps to validated that the property actually exists, and + # also that it is not mandatory + delattr(o, prop) + +set_timestamp_now[vardepsexclude] = "SPDX_INCLUDE_TIMESTAMPS" + +def add_license_expression(d, objset, license_expression): + from pathlib import Path + import oe.spdx30 + import oe.sbom30 + + license_data = d.getVar("SPDX_LICENSE_DATA") + simple_license_text = {} + license_text_map = {} + license_ref_idx = 0 + + def add_license_text(name): + nonlocal objset + nonlocal simple_license_text + + if name in simple_license_text: + return simple_license_text[name] + + lic = objset.find_filter( + oe.spdx30.simplelicensing_SimpleLicensingText, + name=name, + ) + + if lic is not None: + simple_license_text[name] = lic + return lic + + lic = objset.add(oe.spdx30.simplelicensing_SimpleLicensingText( + _id=objset.new_spdxid("license-text", name), + creationInfo=objset.doc.creationInfo, + name=name, + )) + simple_license_text[name] = lic + + if name == "PD": + lic.simplelicensing_licenseText = "Software released to the public domain" + return lic + + # Seach for the license in COMMON_LICENSE_DIR and LICENSE_PATH + for directory in [d.getVar('COMMON_LICENSE_DIR')] + (d.getVar('LICENSE_PATH') or '').split(): + try: + with (Path(directory) / name).open(errors="replace") as f: + lic.simplelicensing_licenseText = f.read() + return lic + + except FileNotFoundError: + pass + + # If it's not SPDX or PD, then NO_GENERIC_LICENSE must be set + filename = d.getVarFlag('NO_GENERIC_LICENSE', name) + if filename: + filename = d.expand("${S}/" + filename) + with open(filename, errors="replace") as f: + lic.simplelicensing_licenseText = f.read() + return lic + else: + bb.fatal("Cannot find any text for license %s" % name) + + def convert(l): + nonlocal license_text_map + nonlocal license_ref_idx + + if l == "(" or l == ")": + return l + + if l == "&": + return "AND" + + if l == "|": + return "OR" + + if l == "CLOSED": + return "NONE" + + spdx_license = d.getVarFlag("SPDXLICENSEMAP", l) or l + if spdx_license in license_data["licenses"]: + return spdx_license + + spdx_license = "LicenseRef-" + l + if spdx_license not in license_text_map: + license_text_map[spdx_license] = add_license_text(l)._id + + return spdx_license + + lic_split = license_expression.replace("(", " ( ").replace(")", " ) ").replace("|", " | ").replace("&", " & ").split() + spdx_license_expression = ' '.join(convert(l) for l in lic_split) + + return objset.new_license_expression(spdx_license_expression, license_text_map) + + +def add_package_files(d, objset, topdir, get_spdxid, get_purposes, *, archive=None, ignore_dirs=[], ignore_top_level_dirs=[]): + from pathlib import Path + import oe.spdx30 + import oe.sbom30 + + source_date_epoch = d.getVar("SOURCE_DATE_EPOCH") + if source_date_epoch: + source_date_epoch = int(source_date_epoch) + + spdx_files = set() + + file_counter = 1 + for subdir, dirs, files in os.walk(topdir): + dirs[:] = [d for d in dirs if d not in ignore_dirs] + if subdir == str(topdir): + dirs[:] = [d for d in dirs if d not in ignore_top_level_dirs] + + for file in files: 
+ filepath = Path(subdir) / file + if filepath.is_symlink() or not filepath.is_file(): + continue + + bb.debug(1, "Adding file %s to %s" % (filepath, objset.doc._id)) + + filename = str(filepath.relative_to(topdir)) + file_purposes = get_purposes(filepath) + + spdx_file = objset.new_file( + get_spdxid(file_counter), + filename, + filepath, + purposes=file_purposes, + ) + spdx_files.add(spdx_file) + + if oe.spdx30.software_SoftwarePurpose.source in file_purposes: + objset.scan_declared_licenses(spdx_file, filepath) + + if archive is not None: + with filepath.open("rb") as f: + info = archive.gettarinfo(fileobj=f) + info.name = filename + info.uid = 0 + info.gid = 0 + info.uname = "root" + info.gname = "root" + + if source_date_epoch is not None and info.mtime > source_date_epoch: + info.mtime = source_date_epoch + + archive.addfile(info, f) + + file_counter += 1 + + return spdx_files + + +def get_package_sources_from_debug(d, package, package_files, sources, source_hash_cache): + from pathlib import Path + import oe.packagedata + + def file_path_match(file_path, pkg_file): + if file_path.lstrip("/") == pkg_file.name.lstrip("/"): + return True + + for e in pkg_file.extension: + if isinstance(e, oe.sbom30.OEFileNameAliasExtension): + for a in e.aliases: + if file_path.lstrip("/") == a.lstrip("/"): + return True + + return False + + debug_search_paths = [ + Path(d.getVar('PKGD')), + Path(d.getVar('STAGING_DIR_TARGET')), + Path(d.getVar('STAGING_DIR_NATIVE')), + Path(d.getVar('STAGING_KERNEL_DIR')), + ] + + pkg_data = oe.packagedata.read_subpkgdata_extended(package, d) + + if pkg_data is None: + return + + dep_source_files = set() + + for file_path, file_data in pkg_data["files_info"].items(): + if not "debugsrc" in file_data: + continue + + if not any(file_path_match(file_path, pkg_file) for pkg_file in package_files): + bb.fatal("No package file found for %s in %s; SPDX found: %s" % (str(file_path), package, + " ".join(p.name for p in package_files))) + continue + + for debugsrc in file_data["debugsrc"]: + for search in debug_search_paths: + if debugsrc.startswith("/usr/src/kernel"): + debugsrc_path = search / debugsrc.replace('/usr/src/kernel/', '') + else: + debugsrc_path = search / debugsrc.lstrip("/") + + if debugsrc_path in source_hash_cache: + file_sha256 = source_hash_cache[debugsrc_path] + if file_sha256 is None: + continue + else: + if not debugsrc_path.exists(): + source_hash_cache[debugsrc_path] = None + continue + + file_sha256 = bb.utils.sha256_file(debugsrc_path) + source_hash_cache[debugsrc_path] = file_sha256 + + if file_sha256 in sources: + source_file = sources[file_sha256] + dep_source_files.add(source_file) + else: + bb.debug(1, "Debug source %s with SHA256 %s not found in any dependency" % (str(debugsrc_path), file_sha256)) + break + else: + bb.debug(1, "Debug source %s not found" % debugsrc) + + return dep_source_files + +get_package_sources_from_debug[vardepsexclude] += "STAGING_KERNEL_DIR" + +def collect_dep_objsets(d, build): + import json + from pathlib import Path + import oe.sbom30 + import oe.spdx30 + + deps = get_spdx_deps(d) + + dep_objsets = [] + dep_builds = set() + + dep_build_spdxids = set() + for dep_pn, _, in_taskhash in deps: + bb.debug(1, "Fetching SPDX for dependency %s" % (dep_pn)) + dep_build, dep_objset = oe.sbom30.find_root_obj_in_jsonld(d, "recipes", dep_pn, oe.spdx30.build_Build) + # If the dependency is part of the taskhash, return it to be linked + # against. 
Otherwise, it cannot be linked against because this recipe + # will not rebuilt if dependency changes + if in_taskhash: + dep_objsets.append(dep_objset) + + # The build _can_ be linked against (by alias) + dep_builds.add(dep_build) + + return dep_objsets, dep_builds + +collect_dep_objsets[vardepsexclude] = "SSTATE_ARCHS" + +def collect_dep_sources(dep_objsets): + import oe.spdx30 + import oe.sbom30 + + sources = {} + for objset in dep_objsets: + # Don't collect sources from native recipes as they + # match non-native sources also. + if objset.is_native(): + continue + + bb.debug(1, "Fetching Sources for dependency %s" % (objset.doc.name)) + + dep_build = objset.find_root(oe.spdx30.build_Build) + if not dep_build: + bb.fatal("Unable to find a build") + + for e in objset.foreach_type(oe.spdx30.Relationship): + if dep_build is not e.from_: + continue + + if e.relationshipType != oe.spdx30.RelationshipType.hasInputs: + continue + + for to in e.to: + if not isinstance(to, oe.spdx30.software_File): + continue + + if to.software_primaryPurpose != oe.spdx30.software_SoftwarePurpose.source: + continue + + for v in to.verifiedUsing: + if v.algorithm == oe.spdx30.HashAlgorithm.sha256: + sources[v.hashValue] = to + break + else: + bb.fatal("No SHA256 found for %s in %s" % (to.name, objset.doc.name)) + + return sources + +def add_download_files(d, objset): + import oe.patch + import oe.spdx30 + import os + + inputs = set() + + urls = d.getVar("SRC_URI").split() + fetch = bb.fetch2.Fetch(urls, d) + + for download_idx, src_uri in enumerate(urls): + fd = fetch.ud[src_uri] + + for name in fd.names: + file_name = os.path.basename(fetch.localpath(src_uri)) + if oe.patch.patch_path(src_uri, fetch, '', expand=False): + primary_purpose = oe.spdx30.software_SoftwarePurpose.patch + else: + primary_purpose = oe.spdx30.software_SoftwarePurpose.source + + if fd.type == "file": + if os.path.isdir(fd.localpath): + walk_idx = 1 + for root, dirs, files in os.walk(fd.localpath): + for f in files: + f_path = os.path.join(root, f) + if os.path.islink(f_path): + # TODO: SPDX doesn't support symlinks yet + continue + + file = objset.new_file( + objset.new_spdxid("source", str(download_idx + 1), str(walk_idx)), + os.path.join(file_name, os.path.relpath(f_path, fd.localpath)), + f_path, + purposes=[primary_purpose], + ) + + inputs.add(file) + walk_idx += 1 + + else: + file = objset.new_file( + objset.new_spdxid("source", str(download_idx + 1)), + file_name, + fd.localpath, + purposes=[primary_purpose], + ) + inputs.add(file) + + else: + uri = fd.type + proto = getattr(fd, "proto", None) + if proto is not None: + uri = uri + "+" + proto + uri = uri + "://" + fd.host + fd.path + + if fd.method.supports_srcrev(): + uri = uri + "@" + fd.revisions[name] + + dl = objset.add(oe.spdx30.software_Package( + _id=objset.new_spdxid("source", str(download_idx + 1)), + creationInfo=objset.doc.creationInfo, + name=file_name, + software_primaryPurpose=primary_purpose, + software_downloadLocation=uri, + )) + + if fd.method.supports_checksum(fd): + # TODO Need something better than hard coding this + for checksum_id in ["sha256", "sha1"]: + expected_checksum = getattr(fd, "%s_expected" % checksum_id, None) + if expected_checksum is None: + continue + + dl.verifiedUsing.append( + oe.spdx30.Hash( + algorithm=getattr(oe.spdx30.HashAlgorithm, checksum_id), + hashValue=expected_checksum, + ) + ) + + inputs.add(dl) + + return inputs + + +def set_purposes(d, element, *var_names, force_purposes=[]): + purposes = force_purposes[:] + + for var_name in 
var_names: + val = d.getVar(var_name) + if val: + purposes.extend(val.split()) + break + + if not purposes: + bb.warn("No SPDX purposes found in %s" % " ".join(var_names)) + return + + element.software_primaryPurpose = getattr(oe.spdx30.software_SoftwarePurpose, purposes[0]) + element.software_additionalPurpose = [getattr(oe.spdx30.software_SoftwarePurpose, p) for p in purposes[1:]] + + +python do_create_spdx() { + import oe.sbom30 + import oe.spdx30 + from pathlib import Path + from contextlib import contextmanager + import oe.cve_check + from datetime import datetime + + def set_var_field(var, obj, name, package=None): + val = None + if package: + val = d.getVar("%s:%s" % (var, package)) + + if not val: + val = d.getVar(var) + + if val: + setattr(obj, name, val) + + deploydir = Path(d.getVar("SPDXDEPLOY")) + deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX")) + spdx_workdir = Path(d.getVar("SPDXWORK")) + include_sources = d.getVar("SPDX_INCLUDE_SOURCES") == "1" + pkg_arch = d.getVar("SSTATE_PKGARCH") + is_native = bb.data.inherits_class("native", d) or bb.data.inherits_class("cross", d) + include_vex = d.getVar("SPDX_INCLUDE_VEX") + if not include_vex in ("none", "current", "all"): + bb.fatal("SPDX_INCLUDE_VEX must be one of 'none', 'current', 'all'") + + build_objset = oe.sbom30.ObjectSet.new_objset(d, d.getVar("PN")) + + build = build_objset.new_task_build("recipe", "recipe") + build_objset.doc.rootElement.append(build) + + build_objset.set_is_native(is_native) + + for var in (d.getVar('SPDX_CUSTOM_ANNOTATION_VARS') or "").split(): + new_annotation( + d, + build_objset, + build, + "%s=%s" % (var, d.getVar(var)), + oe.spdx30.AnnotationType.other + ) + + build_inputs = set() + + # Add CVEs + cve_by_status = {} + if include_vex != "none": + for cve in (d.getVarFlags("CVE_STATUS") or {}): + status, detail, description = oe.cve_check.decode_cve_status(d, cve) + + # If this CVE is fixed upstream, skip it unless all CVEs are + # specified. + if include_vex != "all" and detail in ("fixed-version", "cpe-stable-backport"): + bb.debug(1, "Skipping %s since it is already fixed upstream" % cve) + continue + + cve_by_status.setdefault(status, {})[cve] = ( + build_objset.new_cve_vuln(cve), + detail, + description, + ) + + cpe_ids = oe.cve_check.get_cpe_ids(d.getVar("CVE_PRODUCT"), d.getVar("CVE_VERSION")) + + source_files = add_download_files(d, build_objset) + build_inputs |= source_files + + recipe_spdx_license = add_license_expression(d, build_objset, d.getVar("LICENSE")) + build_objset.new_relationship( + source_files, + oe.spdx30.RelationshipType.hasConcludedLicense, + [recipe_spdx_license], + ) + + if process_sources(d) and include_sources: + bb.debug(1, "Adding source files to SPDX") + spdx_get_src(d) + + build_inputs |= add_package_files( + d, + build_objset, + spdx_workdir, + lambda file_counter: build_objset.new_spdxid("sourcefile", str(file_counter)), + lambda filepath: [oe.spdx30.software_SoftwarePurpose.source], + ignore_dirs=[".git"], + ignore_top_level_dirs=["temp"], + archive=None, + ) + + + dep_objsets, dep_builds = collect_dep_objsets(d, build) + if dep_builds: + build_objset.new_scoped_relationship( + [build], + oe.spdx30.RelationshipType.dependsOn, + oe.spdx30.LifecycleScopeType.build, + sorted(oe.sbom30.get_element_link_id(b) for b in dep_builds), + ) + + debug_source_ids = set() + source_hash_cache = {} + + # Write out the package SPDX data now. 
It is not complete as we cannot + # write the runtime data, so write it to a staging area and a later task + # will write out the final collection + + # TODO: Handle native recipe output + if not is_native: + bb.debug(1, "Collecting Dependency sources files") + sources = collect_dep_sources(dep_objsets) + + bb.build.exec_func("read_subpackage_metadata", d) + + pkgdest = Path(d.getVar("PKGDEST")) + for package in d.getVar("PACKAGES").split(): + if not oe.packagedata.packaged(package, d): + continue + + pkg_name = d.getVar("PKG:%s" % package) or package + + bb.debug(1, "Creating SPDX for package %s" % pkg_name) + + pkg_objset = oe.sbom30.ObjectSet.new_objset(d, pkg_name) + + spdx_package = pkg_objset.add_root(oe.spdx30.software_Package( + _id=pkg_objset.new_spdxid("package", pkg_name), + creationInfo=pkg_objset.doc.creationInfo, + name=pkg_name, + software_packageVersion=d.getVar("PV"), + )) + set_timestamp_now(d, spdx_package, "builtTime") + + set_purposes( + d, + spdx_package, + "SPDX_PACKAGE_ADDITIONAL_PURPOSE:%s" % package, + "SPDX_PACKAGE_ADDITIONAL_PURPOSE", + force_purposes=["install"], + ) + + + supplier = build_objset.new_agent("SPDX_PACKAGE_SUPPLIER") + if supplier is not None: + spdx_package.supplier = supplier if isinstance(supplier, str) else supplier._id + + set_var_field("HOMEPAGE", spdx_package, "software_homePage", package=package) + set_var_field("SUMMARY", spdx_package, "summary", package=package) + set_var_field("DESCRIPTION", spdx_package, "description", package=package) + + pkg_objset.new_scoped_relationship( + [build._id], + oe.spdx30.RelationshipType.hasOutputs, + oe.spdx30.LifecycleScopeType.build, + [spdx_package], + ) + + for cpe_id in cpe_ids: + spdx_package.externalIdentifier.append( + oe.spdx30.ExternalIdentifier( + externalIdentifierType=oe.spdx30.ExternalIdentifierType.cpe23, + identifier=cpe_id, + )) + + # TODO: Generate a file for each actual IPK/DEB/RPM/TGZ file + # generated and link it to the package + #spdx_package_file = pkg_objset.add(oe.spdx30.software_File( + # _id=pkg_objset.new_spdxid("distribution", pkg_name), + # creationInfo=pkg_objset.doc.creationInfo, + # name=pkg_name, + # software_primaryPurpose=spdx_package.software_primaryPurpose, + # software_additionalPurpose=spdx_package.software_additionalPurpose, + #)) + #set_timestamp_now(d, spdx_package_file, "builtTime") + + ## TODO add hashes + #pkg_objset.new_relationship( + # [spdx_package], + # oe.spdx30.RelationshipType.hasDistributionArtifact, + # [spdx_package_file], + #) + + # NOTE: licenses live in the recipe collection and are referenced + # by ID in the package collection(s). 
This helps reduce duplication + # (since a lot of packages will have the same license), and also + # prevents duplicate license SPDX IDs in the packages + package_license = d.getVar("LICENSE:%s" % package) + if package_license and package_license != d.getVar("LICENSE"): + package_spdx_license = add_license_expression(d, build_objset, package_license) + else: + package_spdx_license = recipe_spdx_license + + pkg_objset.new_relationship( + [spdx_package], + oe.spdx30.RelationshipType.hasConcludedLicense, + [package_spdx_license._id], + ) + + # NOTE: CVE Elements live in the recipe collection + all_cves = set() + for status, cves in cve_by_status.items(): + for cve, items in cves.items(): + spdx_cve, detail, description = items + + all_cves.add(spdx_cve._id) + + if status == "Patched": + pkg_objset.new_vex_patched_relationship([spdx_cve._id], [spdx_package]) + elif status == "Unpatched": + pkg_objset.new_vex_unpatched_relationship([spdx_cve._id], [spdx_package]) + elif status == "Ignored": + spdx_vex = pkg_objset.new_vex_ignored_relationship( + [spdx_cve._id], + [spdx_package], + impact_statement=description, + ) + + if detail in ("ignored", "cpe-incorrect", "disputed", "upstream-wontfix"): + # VEX doesn't have justifications for this + pass + elif detail in ("not-applicable-config", "not-applicable-platform"): + for v in spdx_vex: + v.security_justificationType = oe.spdx30.security_VexJustificationType.vulnerableCodeNotPresent + else: + bb.fatal(f"Unknown detail '{detail}' for ignored {cve}") + else: + bb.fatal(f"Unknown CVE status {status}") + + if all_cves: + pkg_objset.new_relationship( + [spdx_package], + oe.spdx30.RelationshipType.hasAssociatedVulnerability, + sorted(list(all_cves)), + ) + + bb.debug(1, "Adding package files to SPDX for package %s" % pkg_name) + package_files = add_package_files( + d, + pkg_objset, + pkgdest / package, + lambda file_counter: pkg_objset.new_spdxid("package", pkg_name, "file", str(file_counter)), + # TODO: Can we know the purpose here? 
+ lambda filepath: [], + ignore_top_level_dirs=['CONTROL', 'DEBIAN'], + archive=None, + ) + + if package_files: + pkg_objset.new_relationship( + [spdx_package], + oe.spdx30.RelationshipType.contains, + sorted(list(package_files)), + ) + + if include_sources: + debug_sources = get_package_sources_from_debug(d, package, package_files, sources, source_hash_cache) + debug_source_ids |= set(oe.sbom30.get_element_link_id(d) for d in debug_sources) + + oe.sbom30.write_recipe_jsonld_doc(d, pkg_objset, "packages-staging", deploydir, create_spdx_id_links=False) + + if include_sources: + bb.debug(1, "Adding sysroot files to SPDX") + sysroot_files = add_package_files( + d, + build_objset, + d.expand("${COMPONENTS_DIR}/${PACKAGE_ARCH}/${PN}"), + lambda file_counter: build_objset.new_spdxid("sysroot", str(file_counter)), + lambda filepath: [], + archive=None, + ) + + if sysroot_files: + build_objset.new_scoped_relationship( + [build], + oe.spdx30.RelationshipType.hasOutputs, + oe.spdx30.LifecycleScopeType.build, + sorted(list(sysroot_files)), + ) + + if build_inputs or debug_source_ids: + build_objset.new_scoped_relationship( + [build], + oe.spdx30.RelationshipType.hasInputs, + oe.spdx30.LifecycleScopeType.build, + sorted(list(build_inputs)) + sorted(list(debug_source_ids)), + ) + + oe.sbom30.write_recipe_jsonld_doc(d, build_objset, "recipes", deploydir) +} +do_create_spdx[vardepsexclude] += "BB_NUMBER_THREADS" +addtask do_create_spdx after \ + do_collect_spdx_deps \ + do_deploy_source_date_epoch \ + do_populate_sysroot do_package do_packagedata \ + ${create_spdx_source_deps(d)} \ + before do_populate_sdk do_populate_sdk_ext do_build do_rm_work + +def create_spdx_source_deps(d): + deps = [] + if d.getVar("SPDX_INCLUDE_SOURCES") == "1": + deps.extend([ + # do_unpack is a hack for now; we only need it to get the + # dependencies do_unpack already has so we can extract the source + # ourselves + "do_unpack", + # For kernel source code + "do_shared_workdir", + ]) + return " ".join(deps) + +SSTATETASKS += "do_create_spdx" +do_create_spdx[sstate-inputdirs] = "${SPDXDEPLOY}" +do_create_spdx[sstate-outputdirs] = "${DEPLOY_DIR_SPDX}" + +python do_create_spdx_setscene () { + sstate_setscene(d) +} +addtask do_create_spdx_setscene + +do_create_spdx[dirs] = "${SPDXWORK}" +do_create_spdx[cleandirs] = "${SPDXDEPLOY} ${SPDXWORK}" +do_create_spdx[depends] += "${PATCHDEPENDENCY}" + +python do_create_package_spdx() { + import oe.sbom30 + import oe.spdx30 + import oe.packagedata + from pathlib import Path + + deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX")) + deploydir = Path(d.getVar("SPDXRUNTIMEDEPLOY")) + is_native = bb.data.inherits_class("native", d) or bb.data.inherits_class("cross", d) + + providers = collect_package_providers(d) + pkg_arch = d.getVar("SSTATE_PKGARCH") + + if not is_native: + bb.build.exec_func("read_subpackage_metadata", d) + + dep_package_cache = {} + + # Any element common to all packages that need to be referenced by ID + # should be written into this objset set + common_objset = oe.sbom30.ObjectSet.new_objset(d, "%s-package-common" % d.getVar("PN")) + + pkgdest = Path(d.getVar("PKGDEST")) + for package in d.getVar("PACKAGES").split(): + localdata = bb.data.createCopy(d) + pkg_name = d.getVar("PKG:%s" % package) or package + localdata.setVar("PKG", pkg_name) + localdata.setVar('OVERRIDES', d.getVar("OVERRIDES", False) + ":" + package) + + if not oe.packagedata.packaged(package, localdata): + continue + + spdx_package, pkg_objset = oe.sbom30.load_obj_in_jsonld( + d, + pkg_arch, + 
"packages-staging", + pkg_name, + oe.spdx30.software_Package, + software_primaryPurpose=oe.spdx30.software_SoftwarePurpose.install, + ) + + # We will write out a new collection, so link it to the new + # creation info in the common package data. The old creation info + # should still exist and be referenced by all the existing elements + # in the package + pkg_objset.creationInfo = pkg_objset.copy_creation_info(common_objset.doc.creationInfo) + + runtime_spdx_deps = set() + + deps = bb.utils.explode_dep_versions2(localdata.getVar("RDEPENDS") or "") + seen_deps = set() + for dep, _ in deps.items(): + if dep in seen_deps: + continue + + if dep not in providers: + continue + + (dep, _) = providers[dep] + + if not oe.packagedata.packaged(dep, localdata): + continue + + dep_pkg_data = oe.packagedata.read_subpkgdata_dict(dep, d) + dep_pkg = dep_pkg_data["PKG"] + + if dep in dep_package_cache: + dep_spdx_package = dep_package_cache[dep] + else: + bb.debug(1, "Searching for %s" % dep_pkg) + dep_spdx_package, _ = oe.sbom30.find_root_obj_in_jsonld( + d, + "packages-staging", + dep_pkg, + oe.spdx30.software_Package, + software_primaryPurpose=oe.spdx30.software_SoftwarePurpose.install, + ) + dep_package_cache[dep] = dep_spdx_package + + runtime_spdx_deps.add(dep_spdx_package) + seen_deps.add(dep) + + if runtime_spdx_deps: + pkg_objset.new_scoped_relationship( + [spdx_package], + oe.spdx30.RelationshipType.dependsOn, + oe.spdx30.LifecycleScopeType.runtime, + [oe.sbom30.get_element_link_id(dep) for dep in runtime_spdx_deps], + ) + + oe.sbom30.write_recipe_jsonld_doc(d, pkg_objset, "packages", deploydir) + + oe.sbom30.write_recipe_jsonld_doc(d, common_objset, "common-package", deploydir) +} + +do_create_package_spdx[vardepsexclude] += "OVERRIDES SSTATE_ARCHS" + +addtask do_create_package_spdx after do_create_spdx before do_build do_rm_work +SSTATETASKS += "do_create_package_spdx" +do_create_package_spdx[sstate-inputdirs] = "${SPDXRUNTIMEDEPLOY}" +do_create_package_spdx[sstate-outputdirs] = "${DEPLOY_DIR_SPDX}" + +python do_create_package_spdx_setscene () { + sstate_setscene(d) +} +addtask do_create_package_spdx_setscene + +do_create_package_spdx[dirs] = "${SPDXRUNTIMEDEPLOY}" +do_create_package_spdx[cleandirs] = "${SPDXRUNTIMEDEPLOY}" +do_create_package_spdx[rdeptask] = "do_create_spdx" + + + +python spdx30_build_started_handler () { + import oe.spdx30 + import oe.sbom30 + import os + from pathlib import Path + from datetime import datetime, timezone + + # Create a copy of the datastore. 
Set PN to "bitbake" so that SPDX IDs can + # be generated + d = e.data.createCopy() + d.setVar("PN", "bitbake") + d.setVar("BB_TASKHASH", "bitbake") + load_spdx_license_data(d) + + deploy_dir_spdx = Path(e.data.getVar("DEPLOY_DIR_SPDX")) + + objset = oe.sbom30.ObjectSet.new_objset(d, "bitbake", False) + + host_import_key = d.getVar("SPDX_BUILD_HOST") + invoked_by = objset.new_agent("SPDX_INVOKED_BY", add=False) + on_behalf_of = objset.new_agent("SPDX_ON_BEHALF_OF", add=False) + + if d.getVar("SPDX_INCLUDE_BITBAKE_PARENT_BUILD") == "1": + # Since the Build objects are unique, we may as well set the creation + # time to the current time instead of the fallback SDE + objset.doc.creationInfo.created = datetime.now(timezone.utc) + + # Each invocation of bitbake should have a unique ID since it is a + # unique build + nonce = os.urandom(16).hex() + + build = objset.add_root(oe.spdx30.build_Build( + _id=objset.new_spdxid(nonce, include_unihash=False), + creationInfo=objset.doc.creationInfo, + build_buildType=oe.sbom30.SPDX_BUILD_TYPE, + )) + set_timestamp_now(d, build, "build_buildStartTime") + + if host_import_key: + objset.new_scoped_relationship( + [build], + oe.spdx30.RelationshipType.hasHost, + oe.spdx30.LifecycleScopeType.build, + [objset.new_import("SPDX_BUILD_HOST")], + ) + + if invoked_by: + objset.add(invoked_by) + invoked_by_spdx = objset.new_scoped_relationship( + [build], + oe.spdx30.RelationshipType.invokedBy, + oe.spdx30.LifecycleScopeType.build, + [invoked_by], + ) + + if on_behalf_of: + objset.add(on_behalf_of) + objset.new_scoped_relationship( + [on_behalf_of], + oe.spdx30.RelationshipType.delegatedTo, + oe.spdx30.LifecycleScopeType.build, + invoked_by_spdx, + ) + + elif on_behalf_of: + bb.warn("SPDX_ON_BEHALF_OF has no effect if SPDX_INVOKED_BY is not set") + + else: + if host_import_key: + bb.warn("SPDX_BUILD_HOST has no effect if SPDX_INCLUDE_BITBAKE_PARENT_BUILD is not set") + + if invoked_by: + bb.warn("SPDX_INVOKED_BY has no effect if SPDX_INCLUDE_BITBAKE_PARENT_BUILD is not set") + + if on_behalf_of: + bb.warn("SPDX_ON_BEHALF_OF has no effect if SPDX_INCLUDE_BITBAKE_PARENT_BUILD is not set") + + for obj in objset.foreach_type(oe.spdx30.Element): + obj.extension.append(oe.sbom30.OELinkExtension(link_spdx_id=False)) + obj.extension.append(oe.sbom30.OEIdAliasExtension()) + + oe.sbom30.write_jsonld_doc(d, objset, deploy_dir_spdx / "bitbake.spdx.json") +} + +addhandler spdx30_build_started_handler +spdx30_build_started_handler[eventmask] = "bb.event.BuildStarted" + diff --git a/meta/classes/create-spdx-image-3.0.bbclass b/meta/classes/create-spdx-image-3.0.bbclass new file mode 100644 index 00000000000..bda11d54d40 --- /dev/null +++ b/meta/classes/create-spdx-image-3.0.bbclass @@ -0,0 +1,415 @@ +# +# Copyright OpenEmbedded Contributors +# +# SPDX-License-Identifier: GPL-2.0-only +# +# SPDX image tasks + +SPDX_ROOTFS_PACKAGES = "${SPDXDIR}/rootfs-packages.json" +SPDXIMAGEDEPLOYDIR = "${SPDXDIR}/image-deploy" +SPDXROOTFSDEPLOY = "${SPDXDIR}/rootfs-deploy" + +def collect_build_package_inputs(d, objset, build, packages): + providers = collect_package_providers(d) + + build_deps = set() + + for name in sorted(packages.keys()): + if name not in providers: + bb.fatal("Unable to find SPDX provider for '%s'" % name) + + pkg_name, pkg_hashfn = providers[name] + + # Copy all of the package SPDX files into the Sbom elements + pkg_spdx, _ = oe.sbom30.find_root_obj_in_jsonld( + d, + "packages", + pkg_name, + oe.spdx30.software_Package, + 
software_primaryPurpose=oe.spdx30.software_SoftwarePurpose.install, + ) + build_deps.add(pkg_spdx._id) + + if build_deps: + objset.new_scoped_relationship( + [build], + oe.spdx30.RelationshipType.hasInputs, + oe.spdx30.LifecycleScopeType.build, + sorted(list(build_deps)), + ) + + +python spdx_collect_rootfs_packages() { + import json + from pathlib import Path + from oe.rootfs import image_list_installed_packages + + root_packages_file = Path(d.getVar("SPDX_ROOTFS_PACKAGES")) + + packages = image_list_installed_packages(d) + if not packages: + packages = {} + + root_packages_file.parent.mkdir(parents=True, exist_ok=True) + with root_packages_file.open("w") as f: + json.dump(packages, f) +} +ROOTFS_POSTUNINSTALL_COMMAND =+ "spdx_collect_rootfs_packages" + +python do_create_rootfs_spdx() { + import json + from pathlib import Path + import oe.spdx30 + import oe.sbom30 + from datetime import datetime + + deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX")) + deploydir = Path(d.getVar("SPDXROOTFSDEPLOY")) + root_packages_file = Path(d.getVar("SPDX_ROOTFS_PACKAGES")) + image_basename = d.getVar("IMAGE_BASENAME") + machine = d.getVar("MACHINE") + + with root_packages_file.open("r") as f: + packages = json.load(f) + + objset = oe.sbom30.ObjectSet.new_objset(d, "%s-%s" % (image_basename, machine)) + + rootfs = objset.add_root(oe.spdx30.software_Package( + _id=objset.new_spdxid("rootfs", image_basename), + creationInfo=objset.doc.creationInfo, + name=image_basename, + software_primaryPurpose=oe.spdx30.software_SoftwarePurpose.archive, + )) + set_timestamp_now(d, rootfs, "builtTime") + + rootfs_build = objset.add_root(objset.new_task_build("rootfs", "rootfs")) + set_timestamp_now(d, rootfs_build, "build_buildEndTime") + + objset.new_scoped_relationship( + [rootfs_build], + oe.spdx30.RelationshipType.hasOutputs, + oe.spdx30.LifecycleScopeType.build, + [rootfs], + ) + + collect_build_package_inputs(d, objset, rootfs_build, packages) + + oe.sbom30.write_recipe_jsonld_doc(d, objset, "rootfs", deploydir) +} +addtask do_create_rootfs_spdx after do_rootfs before do_image +SSTATETASKS += "do_create_rootfs_spdx" +do_create_rootfs_spdx[sstate-inputdirs] = "${SPDXROOTFSDEPLOY}" +do_create_rootfs_spdx[sstate-outputdirs] = "${DEPLOY_DIR_SPDX}" +do_create_rootfs_spdx[recrdeptask] += "do_create_spdx do_create_package_spdx" +do_create_rootfs_spdx[cleandirs] += "${SPDXROOTFSDEPLOY}" + +python do_create_rootfs_spdx_setscene() { + sstate_setscene(d) +} +addtask do_create_rootfs_spdx_setscene + +python do_create_image_spdx() { + import oe.spdx30 + import oe.sbom30 + import json + from pathlib import Path + + image_deploy_dir = Path(d.getVar('IMGDEPLOYDIR')) + manifest_path = Path(d.getVar("IMAGE_OUTPUT_MANIFEST")) + spdx_work_dir = Path(d.getVar('SPDXIMAGEWORK')) + + image_basename = d.getVar('IMAGE_BASENAME') + machine = d.getVar("MACHINE") + + objset = oe.sbom30.ObjectSet.new_objset(d, "%s-%s" % (image_basename, machine)) + + with manifest_path.open("r") as f: + manifest = json.load(f) + + builds = [] + for task in manifest: + imagetype = task["imagetype"] + taskname = task["taskname"] + + image_build = objset.add_root(objset.new_task_build(taskname, "image/%s" % imagetype)) + set_timestamp_now(d, image_build, "build_buildEndTime") + builds.append(image_build) + + artifacts = [] + + for image in task["images"]: + image_filename = image["filename"] + image_path = image_deploy_dir / image_filename + a = objset.add_root(oe.spdx30.software_File( + _id=objset.new_spdxid("image", image_filename), + 
creationInfo=objset.doc.creationInfo, + name=image_filename, + verifiedUsing=[ + oe.spdx30.Hash( + algorithm=oe.spdx30.HashAlgorithm.sha256, + hashValue=bb.utils.sha256_file(image_path), + ) + ] + )) + set_purposes(d, a, "SPDX_IMAGE_PURPOSE:%s" % imagetype, "SPDX_IMAGE_PURPOSE") + set_timestamp_now(d, a, "builtTime") + + artifacts.append(a) + + if artifacts: + objset.new_scoped_relationship( + [image_build], + oe.spdx30.RelationshipType.hasOutputs, + oe.spdx30.LifecycleScopeType.build, + artifacts, + ) + + if builds: + rootfs_image, _ = oe.sbom30.find_root_obj_in_jsonld( + d, + "rootfs", + "%s-%s" % (image_basename, machine), + oe.spdx30.software_Package, + # TODO: Should use a purpose to filter here? + ) + objset.new_scoped_relationship( + builds, + oe.spdx30.RelationshipType.hasInputs, + oe.spdx30.LifecycleScopeType.build, + [rootfs_image._id], + ) + + objset.add_aliases() + objset.link() + oe.sbom30.write_recipe_jsonld_doc(d, objset, "image", spdx_work_dir) +} +addtask do_create_image_spdx after do_image_complete do_create_rootfs_spdx before do_build +SSTATETASKS += "do_create_image_spdx" +SSTATE_SKIP_CREATION:task-combine-image-type-spdx = "1" +do_create_image_spdx[sstate-inputdirs] = "${SPDXIMAGEWORK}" +do_create_image_spdx[sstate-outputdirs] = "${DEPLOY_DIR_SPDX}" +do_create_image_spdx[cleandirs] = "${SPDXIMAGEWORK}" +do_create_image_spdx[dirs] = "${SPDXIMAGEWORK}" + +python do_create_image_spdx_setscene() { + sstate_setscene(d) +} +addtask do_create_image_spdx_setscene + + +python do_create_image_sbom_spdx() { + import os + from pathlib import Path + import oe.spdx30 + import oe.sbom30 + + image_name = d.getVar("IMAGE_NAME") + image_basename = d.getVar("IMAGE_BASENAME") + image_link_name = d.getVar("IMAGE_LINK_NAME") + imgdeploydir = Path(d.getVar("SPDXIMAGEDEPLOYDIR")) + machine = d.getVar("MACHINE") + + spdx_path = imgdeploydir / (image_name + ".spdx.json") + + root_elements = [] + + # TODO: Do we need to add the rootfs or are the image files sufficient? + rootfs_image, _ = oe.sbom30.find_root_obj_in_jsonld( + d, + "rootfs", + "%s-%s" % (image_basename, machine), + oe.spdx30.software_Package, + # TODO: Should use a purpose here? 
+ ) + root_elements.append(rootfs_image._id) + + image_objset, _ = oe.sbom30.find_jsonld(d, "image", "%s-%s" % (image_basename, machine), required=True) + for o in image_objset.foreach_root(oe.spdx30.software_File): + root_elements.append(o._id) + + objset, sbom = oe.sbom30.create_sbom(d, image_name, root_elements) + + oe.sbom30.write_jsonld_doc(d, objset, spdx_path) + + def make_image_link(target_path, suffix): + if image_link_name: + link = imgdeploydir / (image_link_name + suffix) + if link != target_path: + link.symlink_to(os.path.relpath(target_path, link.parent)) + + make_image_link(spdx_path, ".spdx.json") +} +addtask do_create_image_sbom_spdx after do_create_rootfs_spdx do_create_image_spdx before do_build +SSTATETASKS += "do_create_image_sbom_spdx" +SSTATE_SKIP_CREATION:task-create-image-sbom = "1" +do_create_image_sbom_spdx[sstate-inputdirs] = "${SPDXIMAGEDEPLOYDIR}" +do_create_image_sbom_spdx[sstate-outputdirs] = "${DEPLOY_DIR_IMAGE}" +do_create_image_sbom_spdx[stamp-extra-info] = "${MACHINE_ARCH}" +do_create_image_sbom_spdx[cleandirs] = "${SPDXIMAGEDEPLOYDIR}" +do_create_image_sbom_spdx[recrdeptask] += "do_create_spdx do_create_package_spdx" + +python do_create_image_sbom_spdx_setscene() { + sstate_setscene(d) +} +addtask do_create_image_sbom_spdx_setscene + +do_populate_sdk[recrdeptask] += "do_create_spdx do_create_package_spdx" +do_populate_sdk[cleandirs] += "${SPDXSDKWORK}" +do_populate_sdk[postfuncs] += "sdk_create_sbom" +POPULATE_SDK_POST_HOST_COMMAND:append:task-populate-sdk = " sdk_host_create_spdx" +POPULATE_SDK_POST_TARGET_COMMAND:append:task-populate-sdk = " sdk_target_create_spdx" + +do_populate_sdk_ext[recrdeptask] += "do_create_spdx do_create_package_spdx" +do_populate_sdk_ext[cleandirs] += "${SPDXSDKEXTWORK}" +do_populate_sdk_ext[postfuncs] += "sdk_ext_create_sbom" +POPULATE_SDK_POST_HOST_COMMAND:append:task-populate-sdk-ext = " sdk_ext_host_create_spdx" +POPULATE_SDK_POST_TARGET_COMMAND:append:task-populate-sdk-ext = " sdk_ext_target_create_spdx" + +python sdk_host_create_spdx() { + from pathlib import Path + spdx_work_dir = Path(d.getVar('SPDXSDKWORK')) + + sdk_create_spdx(d, "host", spdx_work_dir, d.getVar("TOOLCHAIN_OUTPUTNAME")) +} + +python sdk_target_create_spdx() { + from pathlib import Path + spdx_work_dir = Path(d.getVar('SPDXSDKWORK')) + + sdk_create_spdx(d, "target", spdx_work_dir, d.getVar("TOOLCHAIN_OUTPUTNAME")) +} + +python sdk_ext_host_create_spdx() { + from pathlib import Path + spdx_work_dir = Path(d.getVar('SPDXSDKEXTWORK')) + + # TODO: This doesn't seem to work + sdk_create_spdx(d, "host", spdx_work_dir, d.getVar("TOOLCHAINEXT_OUTPUTNAME")) +} + +python sdk_ext_target_create_spdx() { + from pathlib import Path + spdx_work_dir = Path(d.getVar('SPDXSDKEXTWORK')) + + # TODO: This doesn't seem to work + sdk_create_spdx(d, "target", spdx_work_dir, d.getVar("TOOLCHAINEXT_OUTPUTNAME")) +} + +def sdk_create_spdx(d, sdk_type, spdx_work_dir, toolchain_outputname): + from pathlib import Path + from oe.sdk import sdk_list_installed_packages + import oe.spdx30 + import oe.sbom30 + from datetime import datetime + + sdk_name = toolchain_outputname + "-" + sdk_type + sdk_packages = sdk_list_installed_packages(d, sdk_type == "target") + + objset = oe.sbom30.ObjectSet.new_objset(d, sdk_name) + + sdk_rootfs = objset.add_root(oe.spdx30.software_Package( + _id=objset.new_spdxid("sdk-rootfs", sdk_name), + creationInfo=objset.doc.creationInfo, + name=sdk_name, + software_primaryPurpose=oe.spdx30.software_SoftwarePurpose.archive, + )) + set_timestamp_now(d, 
sdk_rootfs, "builtTime") + + sdk_build = objset.add_root(objset.new_task_build("sdk-rootfs", "sdk-rootfs")) + set_timestamp_now(d, sdk_build, "build_buildEndTime") + + objset.new_scoped_relationship( + [sdk_build], + oe.spdx30.RelationshipType.hasOutputs, + oe.spdx30.LifecycleScopeType.build, + [sdk_rootfs], + ) + + collect_build_package_inputs(d, objset, sdk_build, sdk_packages) + + objset.add_aliases() + oe.sbom30.write_jsonld_doc(d, objset, spdx_work_dir / "sdk-rootfs.spdx.json") + +python sdk_create_sbom() { + from pathlib import Path + sdk_deploydir = Path(d.getVar("SDKDEPLOYDIR")) + spdx_work_dir = Path(d.getVar('SPDXSDKWORK')) + + create_sdk_sbom(d, sdk_deploydir, spdx_work_dir, d.getVar("TOOLCHAIN_OUTPUTNAME")) +} + +python sdk_ext_create_sbom() { + from pathlib import Path + sdk_deploydir = Path(d.getVar("SDKEXTDEPLOYDIR")) + spdx_work_dir = Path(d.getVar('SPDXSDKEXTWORK')) + + create_sdk_sbom(d, sdk_deploydir, spdx_work_dir, d.getVar("TOOLCHAINEXT_OUTPUTNAME")) +} + +def create_sdk_sbom(d, sdk_deploydir, spdx_work_dir, toolchain_outputname): + import oe.spdx30 + import oe.sbom30 + from pathlib import Path + from datetime import datetime + + # Load the document written earlier + rootfs_objset = oe.sbom30.load_jsonld(d, spdx_work_dir / "sdk-rootfs.spdx.json", required=True) + + # Create a new build for the SDK installer + sdk_build = rootfs_objset.new_task_build("sdk-populate", "sdk-populate") + set_timestamp_now(d, sdk_build, "build_buildEndTime") + + rootfs = rootfs_objset.find_root(oe.spdx30.software_Package) + if rootfs is None: + bb.fatal("Unable to find rootfs artifact") + + rootfs_objset.new_scoped_relationship( + [sdk_build], + oe.spdx30.RelationshipType.hasInputs, + oe.spdx30.LifecycleScopeType.build, + [rootfs] + ) + + files = set() + root_files = [] + + # NOTE: os.walk() doesn't return symlinks + for dirpath, dirnames, filenames in os.walk(sdk_deploydir): + for fn in filenames: + fpath = Path(dirpath) / fn + if not fpath.is_file() or fpath.is_symlink(): + continue + + relpath = str(fpath.relative_to(sdk_deploydir)) + + f = rootfs_objset.new_file( + rootfs_objset.new_spdxid("sdk-installer", relpath), + relpath, + fpath, + ) + set_timestamp_now(d, f, "builtTime") + + if fn.endswith(".manifest"): + f.software_primaryPurpose = oe.spdx30.software_SoftwarePurpose.manifest + elif fn.endswith(".testdata.json"): + f.software_primaryPurpose = oe.spdx30.software_SoftwarePurpose.configuration + else: + set_purposes(d, f, "SPDX_SDK_PURPOSE") + root_files.append(f) + + files.add(f) + + if files: + rootfs_objset.new_scoped_relationship( + [sdk_build], + oe.spdx30.RelationshipType.hasOutputs, + oe.spdx30.LifecycleScopeType.build, + files, + ) + else: + bb.warn(f"No SDK output files found in {sdk_deploydir}") + + objset, sbom = oe.sbom30.create_sbom(d, toolchain_outputname, sorted(list(files)), [rootfs_objset]) + + oe.sbom30.write_jsonld_doc(d, objset, sdk_deploydir / (toolchain_outputname + ".spdx.json")) + diff --git a/meta/classes/spdx-common.bbclass b/meta/classes/spdx-common.bbclass index 18254c36aa4..6dfc1fd9e4c 100644 --- a/meta/classes/spdx-common.bbclass +++ b/meta/classes/spdx-common.bbclass @@ -17,6 +17,7 @@ SPDXDEPLOY = "${SPDXDIR}/deploy" SPDXWORK = "${SPDXDIR}/work" SPDXIMAGEWORK = "${SPDXDIR}/image-work" SPDXSDKWORK = "${SPDXDIR}/sdk-work" +SPDXSDKEXTWORK = "${SPDXDIR}/sdk-ext-work" SPDXDEPS = "${SPDXDIR}/deps.json" SPDX_TOOL_NAME ??= "oe-spdx-creator" @@ -61,7 +62,7 @@ def get_json_indent(d): return 2 return None -python() { +def load_spdx_license_data(d): import json if 
d.getVar("SPDX_LICENSE_DATA"): return @@ -71,6 +72,9 @@ python() { # Transform the license array to a dictionary data["licenses"] = {l["licenseId"]: l for l in data["licenses"]} d.setVar("SPDX_LICENSE_DATA", data) + +python() { + load_spdx_license_data(d) } def process_sources(d): diff --git a/meta/lib/oe/sbom30.py b/meta/lib/oe/sbom30.py new file mode 100644 index 00000000000..771e87be796 --- /dev/null +++ b/meta/lib/oe/sbom30.py @@ -0,0 +1,1138 @@ +# +# Copyright OpenEmbedded Contributors +# +# SPDX-License-Identifier: GPL-2.0-only +# + +from pathlib import Path + +import oe.spdx30 +import bb +import re +import hashlib +import uuid +import os +from datetime import datetime, timezone + +OE_SPDX_BASE = "https://rdf.openembedded.org/spdx/3.0/" + +VEX_VERSION = "1.0.0" + +SPDX_BUILD_TYPE = "http://openembedded.org/bitbake" + + +@oe.spdx30.register(OE_SPDX_BASE + "link-extension") +class OELinkExtension(oe.spdx30.extension_Extension): + """ + This custom extension controls if an Element creates a symlink based on + its SPDX ID in the deploy directory. Some elements may not be able to be + linked because they are duplicated in multiple documents (e.g. the bitbake + Build Element). Those elements can add this extension and set link_spdx_id + to False + + It is an internal extension that should be removed when writing out a final + SBoM + """ + + CLOSED = True + INTERNAL = True + + @classmethod + def _register_props(cls): + super()._register_props() + cls._add_property( + "link_spdx_id", + oe.spdx30.BooleanProp(), + OE_SPDX_BASE + "link-spdx-id", + min_count=1, + max_count=1, + ) + + # The symlinks written to the deploy directory are based on the hash of + # the SPDX ID. While this makes it easy to look them up, it can be + # difficult to trace an Element to the hashed symlink name. As a + # debugging aid, this property is set to the basename of the symlink + # when the symlink is created to make it easier to trace + cls._add_property( + "link_name", + oe.spdx30.StringProp(), + OE_SPDX_BASE + "link-name", + max_count=1, + ) + + +@oe.spdx30.register(OE_SPDX_BASE + "id-alias") +class OEIdAliasExtension(oe.spdx30.extension_Extension): + """ + This extension allows an Element to provide an internal alias for the SPDX + ID. Since SPDX requires unique URIs for each SPDX ID, most of the objects + created have a unique UUID namespace and the unihash of the task encoded in + their SPDX ID. However, this causes a problem for referencing documents + across recipes, since the taskhash of a dependency may not factor into the + taskhash of the current task and thus the current task won't rebuild and + see the new SPDX ID when the dependency changes (e.g. ABI safe recipes and + tasks). + + To help work around this, this extension provides a non-unique alias for an + Element by which it can be referenced from other tasks/recipes. When a + final SBoM is created, references to these aliases will be replaced with + the actual unique SPDX ID. + + Most Elements will automatically get an alias created when they are written + out if they do not already have one. To suppress the creation of an alias, + add an extension with a blank `alias` property.
+ + + It is an internal extension that should be removed when writing out a final + SBoM + """ + + CLOSED = True + INTERNAL = True + + @classmethod + def _register_props(cls): + super()._register_props() + cls._add_property( + "alias", + oe.spdx30.StringProp(), + OE_SPDX_BASE + "alias", + max_count=1, + ) + + cls._add_property( + "link_name", + oe.spdx30.StringProp(), + OE_SPDX_BASE + "link-name", + max_count=1, + ) + + +@oe.spdx30.register(OE_SPDX_BASE + "file-name-alias") +class OEFileNameAliasExtension(oe.spdx30.extension_Extension): + CLOSED = True + INTERNAL = True + + @classmethod + def _register_props(cls): + super()._register_props() + cls._add_property( + "aliases", + oe.spdx30.ListProp(oe.spdx30.StringProp()), + OE_SPDX_BASE + "filename-alias", + ) + + +@oe.spdx30.register(OE_SPDX_BASE + "license-scanned") +class OELicenseScannedExtension(oe.spdx30.extension_Extension): + """ + The presence of this extension means the file has already been scanned for + license information + """ + + CLOSED = True + INTERNAL = True + + +@oe.spdx30.register(OE_SPDX_BASE + "document-extension") +class OEDocumentExtension(oe.spdx30.extension_Extension): + """ + This extension is added to a SpdxDocument to indicate various useful bits + of information about its contents + """ + + CLOSED = True + + @classmethod + def _register_props(cls): + super()._register_props() + cls._add_property( + "is_native", + oe.spdx30.BooleanProp(), + OE_SPDX_BASE + "is-native", + max_count=1, + ) + + +def spdxid_hash(*items): + h = hashlib.md5() + for i in items: + if isinstance(i, oe.spdx30.Element): + h.update(i._id.encode("utf-8")) + else: + h.update(i.encode("utf-8")) + return h.hexdigest() + + +def spdx_sde(d): + sde = d.getVar("SOURCE_DATE_EPOCH") + if not sde: + return datetime.now(timezone.utc) + + return datetime.fromtimestamp(int(sde), timezone.utc) + + +def get_element_link_id(e): + """ + Get the string ID which should be used to link to an Element. If the + element has an alias, that will be preferred, otherwise its SPDX ID will be + used. + """ + ext = get_alias(e) + if ext is not None and ext.alias: + return ext.alias + return e._id + + +def set_alias(obj, alias): + for ext in obj.extension: + if not isinstance(ext, OEIdAliasExtension): + continue + ext.alias = alias + return ext + + ext = OEIdAliasExtension(alias=alias) + obj.extension.append(ext) + return ext + + +def get_alias(obj): + for ext in obj.extension: + if not isinstance(ext, OEIdAliasExtension): + continue + return ext + + return None + + +def extract_licenses(filename): + lic_regex = re.compile( + rb"^\W*SPDX-License-Identifier:\s*([ \w\d.()+-]+?)(?:\s+\W*)?$", re.MULTILINE + ) + + try: + with open(filename, "rb") as f: + size = min(15000, os.stat(filename).st_size) + txt = f.read(size) + licenses = re.findall(lic_regex, txt) + if licenses: + ascii_licenses = [lic.decode("ascii") for lic in licenses] + return ascii_licenses + except Exception as e: + bb.warn(f"Exception reading {filename}: {e}") + return [] + + +def to_list(l): + if isinstance(l, set): + l = sorted(list(l)) + + if not isinstance(l, (list, tuple)): + raise TypeError("Must be a list or tuple.
Got %s" % type(l)) + + return l + + +class ObjectSet(oe.spdx30.SHACLObjectSet): + def __init__(self, d): + super().__init__() + self.d = d + + def create_index(self): + self.by_sha256_hash = {} + super().create_index() + + def add_index(self, obj): + # Check that all elements are given an ID before being inserted + if isinstance(obj, oe.spdx30.Element): + if not obj._id: + raise ValueError("Element missing ID") + for ext in obj.extension: + if not isinstance(ext, OEIdAliasExtension): + continue + if ext.alias: + self.obj_by_id[ext.alias] = obj + + for v in obj.verifiedUsing: + if not isinstance(v, oe.spdx30.Hash): + continue + + # Only SHA-256 hashes are indexed; the index is used to find duplicate files + if v.algorithm != oe.spdx30.HashAlgorithm.sha256: + continue + + self.by_sha256_hash.setdefault(v.hashValue, set()).add(obj) + + super().add_index(obj) + if isinstance(obj, oe.spdx30.SpdxDocument): + self.doc = obj + + def __filter_obj(self, obj, attr_filter): + return all(getattr(obj, k) == v for k, v in attr_filter.items()) + + def foreach_filter(self, typ, *, match_subclass=True, **attr_filter): + for obj in self.foreach_type(typ, match_subclass=match_subclass): + if self.__filter_obj(obj, attr_filter): + yield obj + + def find_filter(self, typ, *, match_subclass=True, **attr_filter): + for obj in self.foreach_filter( + typ, match_subclass=match_subclass, **attr_filter + ): + return obj + return None + + def foreach_root(self, typ, **attr_filter): + for obj in self.doc.rootElement: + if not isinstance(obj, typ): + continue + + if self.__filter_obj(obj, attr_filter): + yield obj + + def find_root(self, typ, **attr_filter): + for obj in self.foreach_root(typ, **attr_filter): + return obj + return None + + def add_root(self, obj): + self.add(obj) + self.doc.rootElement.append(obj) + return obj + + def is_native(self): + for e in self.doc.extension: + if not isinstance(e, oe.sbom30.OEDocumentExtension): + continue + + if e.is_native is not None: + return e.is_native + + return False + + def set_is_native(self, is_native): + for e in self.doc.extension: + if not isinstance(e, oe.sbom30.OEDocumentExtension): + continue + + e.is_native = is_native + return + + if is_native: + self.doc.extension.append(oe.sbom30.OEDocumentExtension(is_native=True)) + + def add_aliases(self): + for o in self.foreach_type(oe.spdx30.Element): + if not o._id or o._id.startswith("_:"): + continue + + alias_ext = get_alias(o) + if alias_ext is None: + unihash = self.d.getVar("BB_UNIHASH") + namespace = self.get_namespace() + if unihash not in o._id: + bb.warn(f"Unihash {unihash} not found in {o._id}") + elif namespace not in o._id: + bb.warn(f"Namespace {namespace} not found in {o._id}") + else: + alias_ext = set_alias( + o, + o._id.replace(unihash, "UNIHASH").replace( + namespace, self.d.getVar("PN") + ), + ) + + def remove_internal_extensions(self): + def remove(o): + o.extension = [e for e in o.extension if not getattr(e, "INTERNAL", False)] + + for o in self.foreach_type(oe.spdx30.Element): + remove(o) + + if self.doc: + remove(self.doc) + + def get_namespace(self): + namespace_uuid = uuid.uuid5( + uuid.NAMESPACE_DNS, self.d.getVar("SPDX_UUID_NAMESPACE") + ) + pn = self.d.getVar("PN") + return "%s/%s-%s" % ( + self.d.getVar("SPDX_NAMESPACE_PREFIX"), + pn, + str(uuid.uuid5(namespace_uuid, pn)), + ) + + def new_spdxid(self, *suffix, include_unihash=True): + items = [self.get_namespace()] + if include_unihash: + unihash = self.d.getVar("BB_UNIHASH") + items.append(unihash) + items.extend(re.sub(r"[^a-zA-Z0-9_-]", "_", s) for s in suffix) + return "/".join(items) + + def new_import(self,
key): + base = f"SPDX_IMPORTS_{key}" + spdxid = self.d.getVar(f"{base}_spdxid") + if not spdxid: + bb.fatal(f"{key} is not a valid SPDX_IMPORTS key") + + for i in self.doc.imports: + if i.externalSpdxId == spdxid: + # Already imported + return spdxid + + m = oe.spdx30.ExternalMap(externalSpdxId=spdxid) + + uri = self.d.getVar(f"{base}_uri") + if uri: + m.locationHint = uri + + for pyname, algorithm in oe.spdx30.HashAlgorithm.NAMED_INDIVIDUALS.items(): + value = self.d.getVar(f"{base}_hash_{pyname}") + if value: + m.verifiedUsing.append( + oe.spdx30.Hash( + algorithm=algorithm, + hashValue=value, + ) + ) + + self.doc.imports.append(m) + return spdxid + + def new_agent(self, varname, *, creation_info=None, add=True): + ref_varname = self.d.getVar(f"{varname}_ref") + if ref_varname: + if ref_varname == varname: + bb.fatal(f"{varname} cannot reference itself") + return self.new_agent(ref_varname, creation_info=creation_info) + + import_key = self.d.getVar(f"{varname}_import") + if import_key: + return self.new_import(import_key) + + name = self.d.getVar(f"{varname}_name") + if not name: + return None + + spdxid = self.new_spdxid("agent", name) + agent = self.find_by_id(spdxid) + if agent is not None: + return agent + + agent_type = self.d.getVar("%s_type" % varname) + if agent_type == "person": + agent = oe.spdx30.Person() + elif agent_type == "software": + agent = oe.spdx30.SoftwareAgent() + elif agent_type == "organization": + agent = oe.spdx30.Organization() + elif not agent_type or agent_type == "agent": + agent = oe.spdx30.Agent() + else: + bb.fatal("Unknown agent type '%s' in %s_type" % (agent_type, varname)) + + agent._id = spdxid + agent.creationInfo = creation_info or self.doc.creationInfo + agent.name = name + + comment = self.d.getVar("%s_comment" % varname) + if comment: + agent.comment = comment + + for ( + pyname, + idtype, + ) in oe.spdx30.ExternalIdentifierType.NAMED_INDIVIDUALS.items(): + value = self.d.getVar("%s_id_%s" % (varname, pyname)) + if value: + agent.externalIdentifier.append( + oe.spdx30.ExternalIdentifier( + externalIdentifierType=idtype, + identifier=value, + ) + ) + + if add: + self.add(agent) + + return agent + + def new_creation_info(self): + creation_info = oe.spdx30.CreationInfo() + + name = "%s %s" % ( + self.d.getVar("SPDX_TOOL_NAME"), + self.d.getVar("SPDX_TOOL_VERSION"), + ) + tool = self.add( + oe.spdx30.Tool( + _id=self.new_spdxid("tool", name), + creationInfo=creation_info, + name=name, + ) + ) + + authors = [] + for a in self.d.getVar("SPDX_AUTHORS").split(): + varname = "SPDX_AUTHORS_%s" % a + author = self.new_agent(varname, creation_info=creation_info) + + if not author: + bb.fatal("Unable to find or create author %s" % a) + + authors.append(author) + + creation_info.created = spdx_sde(self.d) + creation_info.specVersion = self.d.getVar("SPDX_VERSION") + creation_info.createdBy = authors + creation_info.createdUsing = [tool] + + return creation_info + + def copy_creation_info(self, copy): + c = oe.spdx30.CreationInfo( + created=spdx_sde(self.d), + specVersion=self.d.getVar("SPDX_VERSION"), + ) + + for author in copy.createdBy: + if isinstance(author, str): + c.createdBy.append(author) + else: + c.createdBy.append(author._id) + + for tool in copy.createdUsing: + if isinstance(tool, str): + c.createdUsing.append(tool) + else: + c.createdUsing.append(tool._id) + + return c + + def new_annotation(self, subject, comment, typ): + return self.add( + oe.spdx30.Annotation( + _id=self.new_spdxid("annotation", spdxid_hash(comment, typ)),
creationInfo=self.doc.creationInfo, + annotationType=typ, + subject=subject, + statement=comment, + ) + ) + + def _new_relationship( + self, + cls, + from_, + typ, + to, + *, + spdxid_name="relationship", + **props, + ): + from_ = to_list(from_) + to = to_list(to) + + if not from_: + return [] + + if not to: + # TODO: Switch to the code constant once SPDX 3.0.1 is released + to = ["https://spdx.org/rdf/3.0.0/terms/Core/NoneElement"] + + ret = [] + + for f in from_: + hash_args = [typ, f] + for k in sorted(props.keys()): + hash_args.append(props[k]) + hash_args.extend(to) + + relationship = self.add( + cls( + _id=self.new_spdxid(spdxid_name, spdxid_hash(*hash_args)), + creationInfo=self.doc.creationInfo, + from_=f, + relationshipType=typ, + to=to, + **props, + ) + ) + ret.append(relationship) + + return ret + + def new_relationship(self, from_, typ, to): + return self._new_relationship(oe.spdx30.Relationship, from_, typ, to) + + def new_scoped_relationship(self, from_, typ, scope, to): + return self._new_relationship( + oe.spdx30.LifecycleScopedRelationship, + from_, + typ, + to, + scope=scope, + ) + + def new_license_expression(self, license_expression, license_text_map={}): + license_list_version = self.d.getVar("SPDX_LICENSE_DATA")["licenseListVersion"] + # SPDX 3 requires that the license list version be a semver + # MAJOR.MINOR.MICRO, but the actual license version might be + # MAJOR.MINOR on some older versions. As such, manually append a .0 + # micro version if it's missing to keep SPDX happy + if license_list_version.count(".") < 2: + license_list_version += ".0" + + spdxid = [ + "license", + license_list_version, + re.sub(r"[^a-zA-Z0-9_-]", "_", license_expression), + ] + + # A list (not a generator) so it can be checked for emptiness and iterated twice + license_text = [ + (k, license_text_map[k]) for k in sorted(license_text_map.keys()) + ] + + if not license_text: + lic = self.find_filter( + oe.spdx30.simplelicensing_LicenseExpression, + simplelicensing_licenseExpression=license_expression, + simplelicensing_licenseListVersion=license_list_version, + ) + if lic is not None: + return lic + else: + spdxid.append(spdxid_hash(*(v for _, v in license_text))) + lic = self.find_by_id(self.new_spdxid(*spdxid)) + if lic is not None: + return lic + + lic = self.add( + oe.spdx30.simplelicensing_LicenseExpression( + _id=self.new_spdxid(*spdxid), + creationInfo=self.doc.creationInfo, + simplelicensing_licenseExpression=license_expression, + simplelicensing_licenseListVersion=license_list_version, + ) + ) + + for key, value in license_text: + lic.simplelicensing_customIdToUri.append( + oe.spdx30.DictionaryEntry(key=key, value=value) + ) + + return lic + + def scan_declared_licenses(self, spdx_file, filepath): + for e in spdx_file.extension: + if isinstance(e, OELicenseScannedExtension): + return + + file_licenses = set() + for extracted_lic in extract_licenses(filepath): + file_licenses.add(self.new_license_expression(extracted_lic)) + + self.new_relationship( + [spdx_file], + oe.spdx30.RelationshipType.hasDeclaredLicense, + file_licenses, + ) + spdx_file.extension.append(OELicenseScannedExtension()) + + def new_file(self, _id, name, path, *, purposes=[]): + sha256_hash = bb.utils.sha256_file(path) + + for f in self.by_sha256_hash.get(sha256_hash, []): + if not isinstance(f, oe.spdx30.software_File): + continue + + if purposes: + new_primary = purposes[0] + new_additional = [] + + if f.software_primaryPurpose: + new_additional.append(f.software_primaryPurpose) + new_additional.extend(f.software_additionalPurpose) + + new_additional = sorted( + list(set(p for p in new_additional
if p != new_primary)) + ) + + f.software_primaryPurpose = new_primary + f.software_additionalPurpose = new_additional + + if f.name != name: + for e in f.extension: + if isinstance(e, OEFileNameAliasExtension): + e.aliases.append(name) + break + else: + f.extension.append(OEFileNameAliasExtension(aliases=[name])) + + return f + + spdx_file = oe.spdx30.software_File( + _id=_id, + creationInfo=self.doc.creationInfo, + name=name, + ) + if purposes: + spdx_file.software_primaryPurpose = purposes[0] + spdx_file.software_additionalPurpose = purposes[1:] + + spdx_file.verifiedUsing.append( + oe.spdx30.Hash( + algorithm=oe.spdx30.HashAlgorithm.sha256, + hashValue=sha256_hash, + ) + ) + + return self.add(spdx_file) + + def new_cve_vuln(self, cve): + v = oe.spdx30.security_Vulnerability() + v._id = self.new_spdxid("vulnerability", cve) + v.creationInfo = self.doc.creationInfo + + v.externalIdentifier.append( + oe.spdx30.ExternalIdentifier( + externalIdentifierType=oe.spdx30.ExternalIdentifierType.cve, + identifier=cve, + identifierLocator=[ + f"https://cveawg.mitre.org/api/cve/{cve}", + f"https://www.cve.org/CVERecord?id={cve}", + ], + ) + ) + return self.add(v) + + def new_vex_patched_relationship(self, from_, to): + return self._new_relationship( + oe.spdx30.security_VexFixedVulnAssessmentRelationship, + from_, + oe.spdx30.RelationshipType.fixedIn, + to, + spdxid_name="vex-fixed", + security_vexVersion=VEX_VERSION, + ) + + def new_vex_unpatched_relationship(self, from_, to): + return self._new_relationship( + oe.spdx30.security_VexAffectedVulnAssessmentRelationship, + from_, + oe.spdx30.RelationshipType.affects, + to, + spdxid_name="vex-affected", + security_vexVersion=VEX_VERSION, + ) + + def new_vex_ignored_relationship(self, from_, to, *, impact_statement): + return self._new_relationship( + oe.spdx30.security_VexNotAffectedVulnAssessmentRelationship, + from_, + oe.spdx30.RelationshipType.doesNotAffect, + to, + spdxid_name="vex-not-affected", + security_vexVersion=VEX_VERSION, + security_impactStatement=impact_statement, + ) + + def import_bitbake_build_objset(self): + deploy_dir_spdx = Path(self.d.getVar("DEPLOY_DIR_SPDX")) + bb_objset = load_jsonld( + self.d, deploy_dir_spdx / "bitbake.spdx.json", required=True + ) + self.doc.imports.extend(bb_objset.doc.imports) + self.update(bb_objset.objects) + + return bb_objset + + def import_bitbake_build(self): + def find_bitbake_build(objset): + return objset.find_filter( + oe.spdx30.build_Build, + build_buildType=SPDX_BUILD_TYPE, + ) + + build = find_bitbake_build(self) + if build: + return build + + bb_objset = self.import_bitbake_build_objset() + build = find_bitbake_build(bb_objset) + if build is None: + bb.fatal(f"No build found in {self.d.getVar('DEPLOY_DIR_SPDX')}") + + return build + + def new_task_build(self, name, typ): + current_task = self.d.getVar("BB_CURRENTTASK") + pn = self.d.getVar("PN") + + build = self.add( + oe.spdx30.build_Build( + _id=self.new_spdxid("build", name), + creationInfo=self.doc.creationInfo, + name=f"{pn}:do_{current_task}:{name}", + build_buildType=f"{SPDX_BUILD_TYPE}/do_{current_task}/{typ}", + ) + ) + + if self.d.getVar("SPDX_INCLUDE_BITBAKE_PARENT_BUILD") == "1": + bitbake_build = self.import_bitbake_build() + + self.new_relationship( + [bitbake_build], + oe.spdx30.RelationshipType.ancestorOf, + [build], + ) + + if self.d.getVar("SPDX_INCLUDE_BUILD_VARIABLES") == "1": + for varname in sorted(self.d.keys()): + if varname.startswith("__"): + continue + + value = self.d.getVar(varname, expand=False) + + # TODO: Deal with non-string
values + if not isinstance(value, str): + continue + + build.parameters.append( + oe.spdx30.DictionaryEntry(key=varname, value=value) + ) + + return build + + def new_archive(self, archive_name): + return self.add( + oe.spdx30.software_File( + _id=self.new_spdxid("archive", str(archive_name)), + creationInfo=self.doc.creationInfo, + name=str(archive_name), + software_primaryPurpose=oe.spdx30.software_SoftwarePurpose.archive, + ) + ) + + @classmethod + def new_objset(cls, d, name, copy_from_bitbake_doc=True): + objset = cls(d) + + document = oe.spdx30.SpdxDocument( + _id=objset.new_spdxid("document", name), + name=name, + ) + document.extension.append(OEIdAliasExtension()) + document.extension.append(OELinkExtension(link_spdx_id=False)) + objset.doc = document + + if copy_from_bitbake_doc: + bb_objset = objset.import_bitbake_build_objset() + document.creationInfo = objset.copy_creation_info( + bb_objset.doc.creationInfo + ) + else: + document.creationInfo = objset.new_creation_info() + + return objset + + def expand_collection(self, *, add_objectsets=[]): + """ + Expands a collection to pull in all missing elements + + Returns the set of ids that could not be found to link into the document + """ + missing_spdxids = set() + imports = {e.externalSpdxId: e for e in self.doc.imports} + + def merge_doc(other): + nonlocal imports + + for e in other.doc.imports: + if not e.externalSpdxId in imports: + imports[e.externalSpdxId] = e + + self.objects |= other.objects + + for o in add_objectsets: + merge_doc(o) + + needed_spdxids = self.link() + provided_spdxids = set(self.obj_by_id.keys()) + + while True: + import_spdxids = set(imports.keys()) + searching_spdxids = ( + needed_spdxids - provided_spdxids - missing_spdxids - import_spdxids + ) + if not searching_spdxids: + break + + spdxid = searching_spdxids.pop() + bb.debug( + 1, + f"Searching for {spdxid}. Remaining: {len(searching_spdxids)}, Total: {len(provided_spdxids)}, Missing: {len(missing_spdxids)}, Imports: {len(import_spdxids)}", + ) + dep_objset, dep_path = find_by_spdxid(self.d, spdxid) + + if dep_objset: + dep_provided = set(dep_objset.obj_by_id.keys()) + if spdxid not in dep_provided: + bb.fatal(f"{spdxid} not found in {dep_path}") + provided_spdxids |= dep_provided + needed_spdxids |= dep_objset.missing_ids + merge_doc(dep_objset) + else: + missing_spdxids.add(spdxid) + + bb.debug(1, "Linking...") + missing = self.link() + if missing != missing_spdxids: + bb.fatal( + f"Linked document doesn't match missing SPDX ID list. 
Got: {missing}\nExpected: {missing_spdxids}" + ) + + self.doc.imports = sorted(imports.values(), key=lambda e: e.externalSpdxId) + + return missing_spdxids + + +def load_jsonld(d, path, required=False): + deserializer = oe.spdx30.JSONLDDeserializer() + objset = ObjectSet(d) + try: + with path.open("rb") as f: + deserializer.read(f, objset) + except FileNotFoundError: + if required: + bb.fatal("No SPDX document named %s found" % path) + return None + + if not objset.doc: + bb.fatal("SPDX Document %s has no SPDXDocument element" % path) + return None + + objset.objects.remove(objset.doc) + return objset + + +def jsonld_arch_path(d, arch, subdir, name, deploydir=None): + if deploydir is None: + deploydir = Path(d.getVar("DEPLOY_DIR_SPDX")) + return deploydir / arch / subdir / (name + ".spdx.json") + + +def jsonld_hash_path(_id): + h = hashlib.sha256(_id.encode("utf-8")).hexdigest() + + return Path("by-spdxid-hash") / h[:2], h + + +def load_jsonld_by_arch(d, arch, subdir, name, *, required=False): + path = jsonld_arch_path(d, arch, subdir, name) + objset = load_jsonld(d, path, required=required) + if objset is not None: + return (objset, path) + return (None, None) + + +def find_jsonld(d, subdir, name, *, required=False): + package_archs = d.getVar("SSTATE_ARCHS").split() + package_archs.reverse() + + for arch in package_archs: + objset, path = load_jsonld_by_arch(d, arch, subdir, name) + if objset is not None: + return (objset, path) + + if required: + bb.fatal("Could not find a %s SPDX document named %s" % (subdir, name)) + + return (None, None) + + +def write_jsonld_doc(d, objset, dest): + if not isinstance(objset, ObjectSet): + bb.fatal("Only an ObjectSet can be serialized") + return + + if not objset.doc: + bb.fatal("ObjectSet is missing a SpdxDocument") + return + + objset.doc.rootElement = sorted(list(set(objset.doc.rootElement))) + objset.doc.profileConformance = sorted( + list( + getattr(oe.spdx30.ProfileIdentifierType, p) + for p in d.getVar("SPDX_PROFILES").split() + ) + ) + + dest.parent.mkdir(exist_ok=True, parents=True) + + if d.getVar("SPDX_PRETTY") == "1": + serializer = oe.spdx30.JSONLDSerializer( + indent=2, + ) + else: + serializer = oe.spdx30.JSONLDInlineSerializer() + + objset.objects.add(objset.doc) + with dest.open("wb") as f: + serializer.write(objset, f, force_at_graph=True) + objset.objects.remove(objset.doc) + + +def write_recipe_jsonld_doc( + d, + objset, + subdir, + deploydir, + *, + create_spdx_id_links=True, +): + pkg_arch = d.getVar("SSTATE_PKGARCH") + + dest = jsonld_arch_path(d, pkg_arch, subdir, objset.doc.name, deploydir=deploydir) + + def link_id(_id): + hash_path = jsonld_hash_path(_id) + + link_name = jsonld_arch_path( + d, + pkg_arch, + *hash_path, + deploydir=deploydir, + ) + try: + link_name.parent.mkdir(exist_ok=True, parents=True) + link_name.symlink_to(os.path.relpath(dest, link_name.parent)) + except: + target = link_name.readlink() + bb.warn( + f"Unable to link {_id} in {dest} as {link_name}.
Already points to {target}" + ) + raise + + return hash_path[-1] + + objset.add_aliases() + + try: + if create_spdx_id_links: + for o in objset.foreach_type(oe.spdx30.Element): + if not o._id or o._id.startswith("_:"): + continue + + ext = None + for e in o.extension: + if not isinstance(e, OELinkExtension): + continue + + ext = e + break + + if ext is None: + ext = OELinkExtension(link_spdx_id=True) + o.extension.append(ext) + + if ext.link_spdx_id: + ext.link_name = link_id(o._id) + + alias_ext = get_alias(o) + if alias_ext is not None and alias_ext.alias: + alias_ext.link_name = link_id(alias_ext.alias) + + finally: + # It is really helpful for debugging if the JSON document is written + # out, so always do that even if there is an error making the links + write_jsonld_doc(d, objset, dest) + + +def find_root_obj_in_jsonld(d, subdir, fn_name, obj_type, **attr_filter): + objset, fn = find_jsonld(d, subdir, fn_name, required=True) + + spdx_obj = objset.find_root(obj_type, **attr_filter) + if not spdx_obj: + bb.fatal("No root %s found in %s" % (obj_type.__name__, fn)) + + return spdx_obj, objset + + +def load_obj_in_jsonld(d, arch, subdir, fn_name, obj_type, **attr_filter): + objset, fn = load_jsonld_by_arch(d, arch, subdir, fn_name, required=True) + + spdx_obj = objset.find_filter(obj_type, **attr_filter) + if not spdx_obj: + bb.fatal("No %s found in %s" % (obj_type.__name__, fn)) + + return spdx_obj, objset + + +def find_by_spdxid(d, spdxid, *, required=False): + return find_jsonld(d, *jsonld_hash_path(spdxid), required=required) + + +def create_sbom(d, name, root_elements, add_objectsets=[]): + objset = ObjectSet.new_objset(d, name) + + sbom = objset.add( + oe.spdx30.software_Sbom( + _id=objset.new_spdxid("sbom", name), + name=name, + creationInfo=objset.doc.creationInfo, + software_sbomType=[oe.spdx30.software_SbomType.build], + rootElement=root_elements, + ) + ) + + missing_spdxids = objset.expand_collection(add_objectsets=add_objectsets) + if missing_spdxids: + bb.warn( + "The following SPDX IDs were unable to be resolved:\n " + + "\n ".join(sorted(list(missing_spdxids))) + ) + + # Filter out internal extensions from final SBoMs + objset.remove_internal_extensions() + + # SBoM should be the only root element of the document + objset.doc.rootElement = [sbom] + + # De-duplicate licenses + unique = set() + dedup = {} + for lic in objset.foreach_type(oe.spdx30.simplelicensing_LicenseExpression): + for u in unique: + if ( + u.simplelicensing_licenseExpression + == lic.simplelicensing_licenseExpression + and u.simplelicensing_licenseListVersion + == lic.simplelicensing_licenseListVersion + ): + dedup[lic] = u + break + else: + unique.add(lic) + + if dedup: + for rel in objset.foreach_filter( + oe.spdx30.Relationship, + relationshipType=oe.spdx30.RelationshipType.hasDeclaredLicense, + ): + rel.to = [dedup.get(to, to) for to in rel.to] + + for rel in objset.foreach_filter( + oe.spdx30.Relationship, + relationshipType=oe.spdx30.RelationshipType.hasConcludedLicense, + ): + rel.to = [dedup.get(to, to) for to in rel.to] + + for k, v in dedup.items(): + bb.debug(1, f"Removing duplicate License {k._id} -> {v._id}") + objset.objects.remove(k) + + objset.create_index() + + return objset, sbom diff --git a/meta/lib/oe/spdx30.py b/meta/lib/oe/spdx30.py new file mode 100644 index 00000000000..ae74ce36f46 --- /dev/null +++ b/meta/lib/oe/spdx30.py @@ -0,0 +1,6020 @@ +#! /usr/bin/env python3 +# +# Generated Python bindings from a SHACL model +# +# This file was automatically generated by shacl2code. 
DO NOT MANUALLY MODIFY IT +# +# SPDX-License-Identifier: MIT + +import functools +import hashlib +import json +import re +import sys +import threading +import time +from contextlib import contextmanager +from datetime import datetime, timezone, timedelta +from enum import Enum +from abc import ABC, abstractmethod + + +def check_type(obj, types): + if not isinstance(obj, types): + if isinstance(types, (list, tuple)): + raise TypeError( + f"Value must be one of type: {', '.join(t.__name__ for t in types)}. Got {type(obj)}" + ) + raise TypeError(f"Value must be of type {types.__name__}. Got {type(obj)}") + + +class Property(ABC): + """ + A generic SHACL object property. The different types will derive from this + class + """ + + def __init__(self, *, pattern=None): + self.pattern = pattern + + def init(self): + return None + + def validate(self, value): + check_type(value, self.VALID_TYPES) + if self.pattern is not None and not re.search( + self.pattern, self.to_string(value) + ): + raise ValueError( + f"Value is not correctly formatted. Got '{self.to_string(value)}'" + ) + + def set(self, value): + return value + + def check_min_count(self, value, min_count): + return min_count == 1 + + def check_max_count(self, value, max_count): + return max_count == 1 + + def elide(self, value): + return value is None + + def walk(self, value, callback, path): + callback(value, path) + + def iter_objects(self, value, recursive, visited): + return [] + + def link_prop(self, value, objectset, missing, visited): + return value + + def to_string(self, value): + return str(value) + + @abstractmethod + def encode(self, encoder, value, state): + pass + + @abstractmethod + def decode(self, decoder, *, objectset=None): + pass + + +class StringProp(Property): + """ + A scalar string property for an SHACL object + """ + + VALID_TYPES = str + + def set(self, value): + return str(value) + + def encode(self, encoder, value, state): + encoder.write_string(value) + + def decode(self, decoder, *, objectset=None): + return decoder.read_string() + + +class AnyURIProp(StringProp): + def encode(self, encoder, value, state): + encoder.write_iri(value) + + def decode(self, decoder, *, objectset=None): + return decoder.read_iri() + + +class DateTimeProp(Property): + """ + A Date/Time Object with optional timezone + """ + + VALID_TYPES = datetime + UTC_FORMAT_STR = "%Y-%m-%dT%H:%M:%SZ" + REGEX = r"^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}(Z|[+-]\d{2}:\d{2})?$" + + def set(self, value): + return self._normalize(value) + + def encode(self, encoder, value, state): + encoder.write_datetime(self.to_string(value)) + + def decode(self, decoder, *, objectset=None): + s = decoder.read_datetime() + if s is None: + return None + v = self.from_string(s) + return self._normalize(v) + + def _normalize(self, value): + if value.utcoffset() is None: + value = value.astimezone() + offset = value.utcoffset() + if offset % timedelta(minutes=1): + offset = offset - (offset % timedelta(minutes=1)) + value = value.replace(tzinfo=timezone(offset)) + value = value.replace(microsecond=0) + return value + + def to_string(self, value): + value = self._normalize(value) + if value.tzinfo == timezone.utc: + return value.strftime(self.UTC_FORMAT_STR) + return value.isoformat() + + def from_string(self, value): + if not re.match(self.REGEX, value): + raise ValueError(f"'{value}' is not a correctly formatted datetime") + if "Z" in value: + d = datetime( + *(time.strptime(value, self.UTC_FORMAT_STR)[0:6]), + tzinfo=timezone.utc, + ) + else: + d = 
datetime.fromisoformat(value) + + return self._normalize(d) + + +class DateTimeStampProp(DateTimeProp): + """ + A Date/Time Object with required timestamp + """ + + REGEX = r"^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}(Z|[+-]\d{2}:\d{2})$" + + +class IntegerProp(Property): + VALID_TYPES = int + + def set(self, value): + return int(value) + + def encode(self, encoder, value, state): + encoder.write_integer(value) + + def decode(self, decoder, *, objectset=None): + return decoder.read_integer() + + +class PositiveIntegerProp(IntegerProp): + def validate(self, value): + super().validate(value) + if value < 1: + raise ValueError(f"Value must be >=1. Got {value}") + + +class NonNegativeIntegerProp(IntegerProp): + def validate(self, value): + super().validate(value) + if value < 0: + raise ValueError(f"Value must be >= 0. Got {value}") + + +class BooleanProp(Property): + VALID_TYPES = bool + + def set(self, value): + return bool(value) + + def encode(self, encoder, value, state): + encoder.write_bool(value) + + def decode(self, decoder, *, objectset=None): + return decoder.read_bool() + + +class FloatProp(Property): + VALID_TYPES = (float, int) + + def set(self, value): + return float(value) + + def encode(self, encoder, value, state): + encoder.write_float(value) + + def decode(self, decoder, *, objectset=None): + return decoder.read_float() + + +class ObjectProp(Property): + """ + A scalar SHACL object property of a SHACL object + """ + + def __init__(self, cls, required): + super().__init__() + self.cls = cls + self.required = required + + def init(self): + if self.required and not self.cls.IS_ABSTRACT: + return self.cls() + return None + + def validate(self, value): + check_type(value, (self.cls, str)) + + def walk(self, value, callback, path): + if value is None: + return + + if not isinstance(value, str): + value.walk(callback, path) + else: + callback(value, path) + + def iter_objects(self, value, recursive, visited): + if value is None or isinstance(value, str): + return + + if value not in visited: + visited.add(value) + yield value + + if recursive: + for c in value.iter_objects(recursive=True, visited=visited): + yield c + + def encode(self, encoder, value, state): + if value is None: + raise ValueError("Object cannot be None") + + if isinstance(value, str): + value = _NI_ENCODE_CONTEXT.get(value, value) + encoder.write_iri(value) + return + + return value.encode(encoder, state) + + def decode(self, decoder, *, objectset=None): + iri = decoder.read_iri() + if iri is None: + return self.cls.decode(decoder, objectset=objectset) + + iri = _NI_DECODE_CONTEXT.get(iri, iri) + + if objectset is None: + return iri + + obj = objectset.find_by_id(iri) + if obj is None: + return iri + + self.validate(obj) + return obj + + def link_prop(self, value, objectset, missing, visited): + if value is None: + return value + + if isinstance(value, str): + o = objectset.find_by_id(value) + if o is not None: + self.validate(o) + return o + + if missing is not None: + missing.add(value) + + return value + + # De-duplicate IDs + if value._id: + value = objectset.find_by_id(value._id, value) + self.validate(value) + + value.link_helper(objectset, missing, visited) + return value + + +class ListProxy(object): + def __init__(self, prop, data=None): + if data is None: + self.__data = [] + else: + self.__data = data + self.__prop = prop + + def append(self, value): + self.__prop.validate(value) + self.__data.append(self.__prop.set(value)) + + def insert(self, idx, value): + self.__prop.validate(value) + 
self.__data.insert(idx, self.__prop.set(value)) + + def extend(self, items): + for i in items: + self.append(i) + + def sort(self, *args, **kwargs): + self.__data.sort(*args, **kwargs) + + def __getitem__(self, key): + return self.__data[key] + + def __setitem__(self, key, value): + if isinstance(key, slice): + for v in value: + self.__prop.validate(v) + self.__data[key] = [self.__prop.set(v) for v in value] + else: + self.__prop.validate(value) + self.__data[key] = self.__prop.set(value) + + def __delitem__(self, key): + del self.__data[key] + + def __contains__(self, item): + return item in self.__data + + def __iter__(self): + return iter(self.__data) + + def __len__(self): + return len(self.__data) + + def __str__(self): + return str(self.__data) + + def __repr__(self): + return repr(self.__data) + + def __eq__(self, other): + if isinstance(other, ListProxy): + return self.__data == other.__data + + return self.__data == other + + +class ListProp(Property): + """ + A list of SHACL properties + """ + + VALID_TYPES = (list, ListProxy) + + def __init__(self, prop): + super().__init__() + self.prop = prop + + def init(self): + return ListProxy(self.prop) + + def validate(self, value): + super().validate(value) + + for i in value: + self.prop.validate(i) + + def set(self, value): + if isinstance(value, ListProxy): + return value + + return ListProxy(self.prop, [self.prop.set(d) for d in value]) + + def check_min_count(self, value, min_count): + check_type(value, ListProxy) + return len(value) >= min_count + + def check_max_count(self, value, max_count): + check_type(value, ListProxy) + return len(value) <= max_count + + def elide(self, value): + check_type(value, ListProxy) + return len(value) == 0 + + def walk(self, value, callback, path): + callback(value, path) + for idx, v in enumerate(value): + self.prop.walk(v, callback, path + [f"[{idx}]"]) + + def iter_objects(self, value, recursive, visited): + for v in value: + for c in self.prop.iter_objects(v, recursive, visited): + yield c + + def link_prop(self, value, objectset, missing, visited): + if isinstance(value, ListProxy): + data = [self.prop.link_prop(v, objectset, missing, visited) for v in value] + else: + data = [self.prop.link_prop(v, objectset, missing, visited) for v in value] + + return ListProxy(self.prop, data=data) + + def encode(self, encoder, value, state): + check_type(value, ListProxy) + + with encoder.write_list() as list_s: + for v in value: + with list_s.write_list_item() as item_s: + self.prop.encode(item_s, v, state) + + def decode(self, decoder, *, objectset=None): + data = [] + for val_d in decoder.read_list(): + v = self.prop.decode(val_d, objectset=objectset) + self.prop.validate(v) + data.append(v) + + return ListProxy(self.prop, data=data) + + +class EnumProp(Property): + VALID_TYPES = str + + def __init__(self, values, *, pattern=None): + super().__init__(pattern=pattern) + self.values = values + + def validate(self, value): + super().validate(value) + + valid_values = (iri for iri, _ in self.values) + if value not in valid_values: + raise ValueError( + f"'{value}' is not a valid value. 
Choose one of {' '.join(valid_values)}" + ) + + def encode(self, encoder, value, state): + for iri, compact in self.values: + if iri == value: + encoder.write_enum(value, self, compact) + return + + encoder.write_enum(value, self) + + def decode(self, decoder, *, objectset=None): + v = decoder.read_enum(self) + for iri, compact in self.values: + if v == compact: + return iri + return v + + +class NodeKind(Enum): + BlankNode = 1 + IRI = 2 + BlankNodeOrIRI = 3 + + +def is_IRI(s): + if not isinstance(s, str): + return False + if s.startswith("_:"): + return False + if ":" not in s: + return False + return True + + +def is_blank_node(s): + if not isinstance(s, str): + return False + if not s.startswith("_:"): + return False + return True + + +def register(type_iri, *, compact_type=None, abstract=False): + def add_class(key, c): + assert ( + key not in SHACLObject.CLASSES + ), f"{key} already registered to {SHACLObject.CLASSES[key].__name__}" + SHACLObject.CLASSES[key] = c + + def decorator(c): + global NAMED_INDIVIDUALS + + assert issubclass( + c, SHACLObject + ), f"{c.__name__} is not derived from SHACLObject" + + c._OBJ_TYPE = type_iri + c.IS_ABSTRACT = abstract + add_class(type_iri, c) + + c._OBJ_COMPACT_TYPE = compact_type + if compact_type: + add_class(compact_type, c) + + NAMED_INDIVIDUALS |= set(c.NAMED_INDIVIDUALS.values()) + + # Registration is deferred until the first instance of class is created + # so that it has access to any other defined class + c._NEEDS_REG = True + return c + + return decorator + + +register_lock = threading.Lock() +NAMED_INDIVIDUALS = set() + + +@functools.total_ordering +class SHACLObject(object): + CLASSES = {} + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = None + IS_ABSTRACT = True + + def __init__(self, **kwargs): + if self._is_abstract(): + raise NotImplementedError( + f"{self.__class__.__name__} is abstract and cannot be implemented" + ) + + with register_lock: + cls = self.__class__ + if cls._NEEDS_REG: + cls._OBJ_PROPERTIES = {} + cls._OBJ_IRIS = {} + cls._register_props() + cls._NEEDS_REG = False + + self.__dict__["_obj_data"] = {} + self.__dict__["_obj_metadata"] = {} + + for iri, prop, _, _, _, _ in self.__iter_props(): + self.__dict__["_obj_data"][iri] = prop.init() + + for k, v in kwargs.items(): + setattr(self, k, v) + + def _is_abstract(self): + return self.__class__.IS_ABSTRACT + + @classmethod + def _register_props(cls): + cls._add_property("_id", StringProp(), iri="@id") + + @classmethod + def _add_property( + cls, + pyname, + prop, + iri, + min_count=None, + max_count=None, + compact=None, + ): + if pyname in cls._OBJ_IRIS: + raise KeyError(f"'{pyname}' is already defined for '{cls.__name__}'") + if iri in cls._OBJ_PROPERTIES: + raise KeyError(f"'{iri}' is already defined for '{cls.__name__}'") + + while hasattr(cls, pyname): + pyname = pyname + "_" + + pyname = sys.intern(pyname) + iri = sys.intern(iri) + + cls._OBJ_IRIS[pyname] = iri + cls._OBJ_PROPERTIES[iri] = (prop, min_count, max_count, pyname, compact) + + def __setattr__(self, name, value): + if name == self.ID_ALIAS: + self["@id"] = value + return + + try: + iri = self._OBJ_IRIS[name] + self[iri] = value + except KeyError: + raise AttributeError( + f"'{name}' is not a valid property of {self.__class__.__name__}" + ) + + def __getattr__(self, name): + if name in self._OBJ_IRIS: + return self.__dict__["_obj_data"][self._OBJ_IRIS[name]] + + if name == self.ID_ALIAS: + return self.__dict__["_obj_data"]["@id"] + + if name == "_metadata": + return self.__dict__["_obj_metadata"] + + 
if name == "_IRI": + return self._OBJ_IRIS + + if name == "TYPE": + return self.__class__._OBJ_TYPE + + if name == "COMPACT_TYPE": + return self.__class__._OBJ_COMPACT_TYPE + + raise AttributeError( + f"'{name}' is not a valid property of {self.__class__.__name__}" + ) + + def __delattr__(self, name): + if name == self.ID_ALIAS: + del self["@id"] + return + + try: + iri = self._OBJ_IRIS[name] + del self[iri] + except KeyError: + raise AttributeError( + f"'{name}' is not a valid property of {self.__class__.__name__}" + ) + + def __get_prop(self, iri): + if iri not in self._OBJ_PROPERTIES: + raise KeyError( + f"'{iri}' is not a valid property of {self.__class__.__name__}" + ) + + return self._OBJ_PROPERTIES[iri] + + def __iter_props(self): + for iri, v in self._OBJ_PROPERTIES.items(): + yield iri, *v + + def __getitem__(self, iri): + return self.__dict__["_obj_data"][iri] + + def __setitem__(self, iri, value): + if iri == "@id": + if self.NODE_KIND == NodeKind.BlankNode: + if not is_blank_node(value): + raise ValueError( + f"{self.__class__.__name__} ({id(self)}) can only have local reference. Property '{iri}' cannot be set to '{value}' and must start with '_:'" + ) + elif self.NODE_KIND == NodeKind.IRI: + if not is_IRI(value): + raise ValueError( + f"{self.__class__.__name__} ({id(self)}) can only have an IRI value. Property '{iri}' cannot be set to '{value}'" + ) + else: + if not is_blank_node(value) and not is_IRI(value): + raise ValueError( + f"{self.__class__.__name__} ({id(self)}) Has invalid Property '{iri}' '{value}'. Must be a blank node or IRI" + ) + + prop, _, _, _, _ = self.__get_prop(iri) + prop.validate(value) + self.__dict__["_obj_data"][iri] = prop.set(value) + + def __delitem__(self, iri): + prop, _, _, _, _ = self.__get_prop(iri) + self.__dict__["_obj_data"][iri] = prop.init() + + def __iter__(self): + return self._OBJ_PROPERTIES.keys() + + def walk(self, callback, path=None): + """ + Walk object tree, invoking the callback for each item + + Callback has the form: + + def callback(object, path): + """ + if path is None: + path = ["."] + + if callback(self, path): + for iri, prop, _, _, _, _ in self.__iter_props(): + prop.walk(self.__dict__["_obj_data"][iri], callback, path + [f".{iri}"]) + + def property_keys(self): + for iri, _, _, _, pyname, compact in self.__iter_props(): + if iri == "@id": + compact = self.ID_ALIAS + yield pyname, iri, compact + + def iter_objects(self, *, recursive=False, visited=None): + """ + Iterate of all objects that are a child of this one + """ + if visited is None: + visited = set() + + for iri, prop, _, _, _, _ in self.__iter_props(): + for c in prop.iter_objects( + self.__dict__["_obj_data"][iri], recursive=recursive, visited=visited + ): + yield c + + def encode(self, encoder, state): + idname = self.ID_ALIAS or self._OBJ_IRIS["_id"] + if not self._id and self.NODE_KIND == NodeKind.IRI: + raise ValueError( + f"{self.__class__.__name__} ({id(self)}) must have a IRI for property '{idname}'" + ) + + if state.is_written(self): + encoder.write_iri(state.get_object_id(self)) + return + + state.add_written(self) + + with encoder.write_object( + self, + state.get_object_id(self), + bool(self._id) or state.is_refed(self), + ) as obj_s: + self._encode_properties(obj_s, state) + + def _encode_properties(self, encoder, state): + for iri, prop, min_count, max_count, pyname, compact in self.__iter_props(): + value = self.__dict__["_obj_data"][iri] + if prop.elide(value): + if min_count: + raise ValueError( + f"Property '{pyname}' in 
{self.__class__.__name__} ({id(self)}) is required (currently {value!r})" + ) + continue + + if min_count is not None: + if not prop.check_min_count(value, min_count): + raise ValueError( + f"Property '{pyname}' in {self.__class__.__name__} ({id(self)}) requires a minimum of {min_count} elements" + ) + + if max_count is not None: + if not prop.check_max_count(value, max_count): + raise ValueError( + f"Property '{pyname}' in {self.__class__.__name__} ({id(self)}) requires a maximum of {max_count} elements" + ) + + if iri == self._OBJ_IRIS["_id"]: + continue + + with encoder.write_property(iri, compact) as prop_s: + prop.encode(prop_s, value, state) + + @classmethod + def _make_object(cls, typ): + if typ not in cls.CLASSES: + raise TypeError(f"Unknown type {typ}") + + return cls.CLASSES[typ]() + + @classmethod + def decode(cls, decoder, *, objectset=None): + typ, obj_d = decoder.read_object() + if typ is None: + raise TypeError("Unable to determine type for object") + + obj = cls._make_object(typ) + for key in (obj.ID_ALIAS, obj._OBJ_IRIS["_id"]): + with obj_d.read_property(key) as prop_d: + if prop_d is None: + continue + + _id = prop_d.read_iri() + if _id is None: + raise TypeError(f"Object key '{key}' is the wrong type") + + obj._id = _id + break + + if obj.NODE_KIND == NodeKind.IRI and not obj._id: + raise ValueError("Object is missing required IRI") + + if objectset is not None: + if obj._id: + v = objectset.find_by_id(_id) + if v is not None: + return v + + obj._decode_properties(obj_d, objectset=objectset) + + if objectset is not None: + objectset.add_index(obj) + return obj + + def _decode_properties(self, decoder, objectset=None): + for key in decoder.object_keys(): + if not self._decode_prop(decoder, key, objectset=objectset): + raise KeyError(f"Unknown property '{key}'") + + def _decode_prop(self, decoder, key, objectset=None): + if key in (self._OBJ_IRIS["_id"], self.ID_ALIAS): + return True + + for iri, prop, _, _, _, compact in self.__iter_props(): + if compact == key: + read_key = compact + elif iri == key: + read_key = iri + else: + continue + + with decoder.read_property(read_key) as prop_d: + v = prop.decode(prop_d, objectset=objectset) + prop.validate(v) + self.__dict__["_obj_data"][iri] = v + return True + + return False + + def link_helper(self, objectset, missing, visited): + if self in visited: + return + + visited.add(self) + + for iri, prop, _, _, _, _ in self.__iter_props(): + self.__dict__["_obj_data"][iri] = prop.link_prop( + self.__dict__["_obj_data"][iri], + objectset, + missing, + visited, + ) + + def __str__(self): + parts = [ + f"{self.__class__.__name__}(", + ] + if self._id: + parts.append(f"@id='{self._id}'") + parts.append(")") + return "".join(parts) + + def __hash__(self): + return super().__hash__() + + def __eq__(self, other): + return super().__eq__(other) + + def __lt__(self, other): + def sort_key(obj): + if isinstance(obj, str): + return (obj, "", "", "") + return ( + obj._id or "", + obj.TYPE, + getattr(obj, "name", None) or "", + id(obj), + ) + + return sort_key(self) < sort_key(other) + + +class SHACLExtensibleObject(object): + CLOSED = False + + def __init__(self, typ=None, **kwargs): + if typ: + self.__dict__["_obj_TYPE"] = (typ, None) + else: + self.__dict__["_obj_TYPE"] = (self._OBJ_TYPE, self._OBJ_COMPACT_TYPE) + super().__init__(**kwargs) + + def _is_abstract(self): + # Unknown classes are assumed to not be abstract so that they can be + # deserialized + typ = self.__dict__["_obj_TYPE"][0] + if typ in self.__class__.CLASSES: + return 
self.__class__.CLASSES[typ].IS_ABSTRACT + + return False + + @classmethod + def _make_object(cls, typ): + # Check for a known type, and if so, deserialize as that instead + if typ in cls.CLASSES: + return cls.CLASSES[typ]() + + obj = cls(typ) + return obj + + def _decode_properties(self, decoder, objectset=None): + if self.CLOSED: + super()._decode_properties(decoder, objectset=objectset) + return + + for key in decoder.object_keys(): + if self._decode_prop(decoder, key, objectset=objectset): + continue + + if not is_IRI(key): + raise KeyError( + f"Extensible object properties must be IRIs. Got '{key}'" + ) + + with decoder.read_property(key) as prop_d: + self.__dict__["_obj_data"][key] = prop_d.read_value() + + def _encode_properties(self, encoder, state): + def encode_value(encoder, v): + if isinstance(v, bool): + encoder.write_bool(v) + elif isinstance(v, str): + encoder.write_string(v) + elif isinstance(v, int): + encoder.write_integer(v) + elif isinstance(v, float): + encoder.write_float(v) + else: + raise TypeError( + f"Unsupported serialized type {type(v)} with value '{v}'" + ) + + super()._encode_properties(encoder, state) + if self.CLOSED: + return + + for iri, value in self.__dict__["_obj_data"].items(): + if iri in self._OBJ_PROPERTIES: + continue + + with encoder.write_property(iri) as prop_s: + encode_value(prop_s, value) + + def __setitem__(self, iri, value): + try: + super().__setitem__(iri, value) + except KeyError: + if self.CLOSED: + raise + + if not is_IRI(iri): + raise KeyError(f"Key '{iri}' must be an IRI") + self.__dict__["_obj_data"][iri] = value + + def __delitem__(self, iri): + try: + super().__delitem__(iri) + except KeyError: + if self.CLOSED: + raise + + if not is_IRI(iri): + raise KeyError(f"Key '{iri}' must be an IRI") + del self.__dict__["_obj_data"][iri] + + def __getattr__(self, name): + if name == "TYPE": + return self.__dict__["_obj_TYPE"][0] + if name == "COMPACT_TYPE": + return self.__dict__["_obj_TYPE"][1] + return super().__getattr__(name) + + def property_keys(self): + iris = set() + for pyname, iri, compact in super().property_keys(): + iris.add(iri) + yield pyname, iri, compact + + if self.CLOSED: + return + + for iri in self.__dict__["_obj_data"].keys(): + if iri not in iris: + yield None, iri, None + + +class SHACLObjectSet(object): + def __init__(self, objects=[], *, link=False): + self.objects = set() + self.missing_ids = set() + for o in objects: + self.objects.add(o) + self.create_index() + if link: + self._link() + + def create_index(self): + """ + (re)Create object index + + Creates or recreates the indices for the object set to enable fast + lookup. 
All objects and their children are walked and indexed + """ + self.obj_by_id = {} + self.obj_by_type = {} + for o in self.foreach(): + self.add_index(o) + + def add_index(self, obj): + """ + Add object to index + + Adds the object to all appropriate indices + """ + + def reg_type(typ, compact, o, exact): + self.obj_by_type.setdefault(typ, set()).add((exact, o)) + if compact: + self.obj_by_type.setdefault(compact, set()).add((exact, o)) + + if not isinstance(obj, SHACLObject): + raise TypeError("Object is not of type SHACLObject") + + for typ in SHACLObject.CLASSES.values(): + if isinstance(obj, typ): + reg_type( + typ._OBJ_TYPE, typ._OBJ_COMPACT_TYPE, obj, obj.__class__ is typ + ) + + # This covers custom extensions + reg_type(obj.TYPE, obj.COMPACT_TYPE, obj, True) + + if not obj._id: + return + + self.missing_ids.discard(obj._id) + + if obj._id in self.obj_by_id: + return + + self.obj_by_id[obj._id] = obj + + def add(self, obj): + """ + Add object to object set + + Adds a SHACLObject to the object set and indexes it. + + NOTE: Child objects of the attached object are not indexed + """ + if not isinstance(obj, SHACLObject): + raise TypeError("Object is not of type SHACLObject") + + if obj not in self.objects: + self.objects.add(obj) + self.add_index(obj) + return obj + + def update(self, *others): + """ + Update the object set, adding all objects from each of the other iterables + """ + for o in others: + for obj in o: + self.add(obj) + + def __contains__(self, item): + """ + Returns True if the item is in the object set + """ + return item in self.objects + + def link(self): + """ + Link object set + + Links the objects in the object set by replacing string object + references with references to the objects themselves. e.g. + a property that references object "https://foo/bar" by a string + reference will be replaced with an actual reference to the object in + the object set with the same ID if it exists in the object set + + If multiple objects with the same ID are found, the duplicates are + eliminated + """ + self.create_index() + return self._link() + + def _link(self): + global NAMED_INDIVIDUALS + + self.missing_ids = set() + visited = set() + + new_objects = set() + + for o in self.objects: + if o._id: + o = self.find_by_id(o._id, o) + o.link_helper(self, self.missing_ids, visited) + new_objects.add(o) + + self.objects = new_objects + + # Remove blank nodes + obj_by_id = {} + for _id, obj in self.obj_by_id.items(): + if _id.startswith("_:"): + del obj._id + else: + obj_by_id[_id] = obj + self.obj_by_id = obj_by_id + + # Named individuals aren't considered missing + self.missing_ids -= NAMED_INDIVIDUALS + + return self.missing_ids + + def find_by_id(self, _id, default=None): + """ + Find object by ID + + Returns the object that matches the specified ID, or default if there is no + object with the specified ID + """ + if _id not in self.obj_by_id: + return default + return self.obj_by_id[_id] + + def foreach(self): + """ + Iterate over every object in the object set, and all child objects + """ + visited = set() + for o in self.objects: + if o not in visited: + yield o + visited.add(o) + + for child in o.iter_objects(recursive=True, visited=visited): + yield child + + def foreach_type(self, typ, *, match_subclass=True): + """ + Iterate over each object of a specified type (or subclass thereof) + + If match_subclass is True, any class derived from typ will also match + (similar to isinstance()). 
If False, only exact matches will be + returned + """ + if not isinstance(typ, str): + if not issubclass(typ, SHACLObject): + raise TypeError(f"Type must be derived from SHACLObject, got {typ}") + typ = typ._OBJ_TYPE + + if typ not in self.obj_by_type: + return + + for exact, o in self.obj_by_type[typ]: + if match_subclass or exact: + yield o + + def merge(self, *objectsets): + """ + Merge object sets + + Returns a new object set that is the combination of this object set and + all provided arguments + """ + new_objects = set() + new_objects |= self.objects + for d in objectsets: + new_objects |= d.objects + + return SHACLObjectSet(new_objects, link=True) + + def encode(self, encoder, force_list=False): + """ + Serialize a list of objects to a serialization encoder + + If force_list is true, a list will always be written using the encoder. + """ + ref_counts = {} + state = EncodeState() + + def walk_callback(value, path): + nonlocal state + nonlocal ref_counts + + if not isinstance(value, SHACLObject): + return True + + # Remove blank node ID for re-assignment + if value._id and value._id.startswith("_:"): + del value._id + + if value._id: + state.add_refed(value) + + # If the object is referenced more than once, add it to the set of + # referenced objects + ref_counts.setdefault(value, 0) + ref_counts[value] += 1 + if ref_counts[value] > 1: + state.add_refed(value) + return False + + return True + + for o in self.objects: + if o._id: + state.add_refed(o) + o.walk(walk_callback) + + use_list = force_list or len(self.objects) > 1 + + if use_list: + # If we are making a list add all the objects referred to by reference + # to the list + objects = list(self.objects | state.ref_objects) + else: + objects = list(self.objects) + + objects.sort() + + if use_list: + # Ensure top level objects are only written in the top level graph + # node, and referenced by ID everywhere else. This is done by setting + # the flag that indicates this object has been written for all the top + # level objects, then clearing it right before serializing the object. 
+ # + # In this way, if an object is referenced before it is supposed to be + # serialized into the @graph, it will serialize as a string instead of + # the actual object + for o in objects: + state.written_objects.add(o) + + with encoder.write_list() as list_s: + for o in objects: + # Allow this specific object to be written now + state.written_objects.remove(o) + with list_s.write_list_item() as item_s: + o.encode(item_s, state) + + else: + objects[0].encode(encoder, state) + + def decode(self, decoder): + self.create_index() + + for obj_d in decoder.read_list(): + o = SHACLObject.decode(obj_d, objectset=self) + self.objects.add(o) + + self._link() + + +class EncodeState(object): + def __init__(self): + self.ref_objects = set() + self.written_objects = set() + self.blank_objects = {} + + def get_object_id(self, o): + if o._id: + return o._id + + if o not in self.blank_objects: + _id = f"_:{o.__class__.__name__}{len(self.blank_objects)}" + self.blank_objects[o] = _id + + return self.blank_objects[o] + + def is_refed(self, o): + return o in self.ref_objects + + def add_refed(self, o): + self.ref_objects.add(o) + + def is_written(self, o): + return o in self.written_objects + + def add_written(self, o): + self.written_objects.add(o) + + +class Decoder(ABC): + @abstractmethod + def read_value(self): + """ + Consume next item + + Consumes the next item of any type + """ + pass + + @abstractmethod + def read_string(self): + """ + Consume the next item as a string. + + Returns the string value of the next item, or `None` if the next item + is not a string + """ + pass + + @abstractmethod + def read_datetime(self): + """ + Consumes the next item as a date & time string + + Returns the string value of the next item, if it is a ISO datetime, or + `None` if the next item is not a ISO datetime string. + + Note that validation of the string is done by the caller, so a minimal + implementation can just check if the next item is a string without + worrying about the format + """ + pass + + @abstractmethod + def read_integer(self): + """ + Consumes the next item as an integer + + Returns the integer value of the next item, or `None` if the next item + is not an integer + """ + pass + + @abstractmethod + def read_iri(self): + """ + Consumes the next item as an IRI string + + Returns the string value of the next item an IRI, or `None` if the next + item is not an IRI. + + The returned string should be either a fully-qualified IRI, or a blank + node ID + """ + pass + + @abstractmethod + def read_enum(self, e): + """ + Consumes the next item as an Enum value string + + Returns the fully qualified IRI of the next enum item, or `None` if the + next item is not an enum value. + + The callee is responsible for validating that the returned IRI is + actually a member of the specified Enum, so the `Decoder` does not need + to check that, but can if it wishes + """ + pass + + @abstractmethod + def read_bool(self): + """ + Consume the next item as a boolean value + + Returns the boolean value of the next item, or `None` if the next item + is not a boolean + """ + pass + + @abstractmethod + def read_float(self): + """ + Consume the next item as a float value + + Returns the float value of the next item, or `None` if the next item is + not a float + """ + pass + + @abstractmethod + def read_list(self): + """ + Consume the next item as a list generator + + This should generate a `Decoder` object for each item in the list. 
The + generated `Decoder` can be used to read the corresponding item from the + list + """ + pass + + @abstractmethod + def read_object(self): + """ + Consume next item as an object + + A context manager that "enters" the next item as an object and yields a + `Decoder` that can read properties from it. If the next item is not an + object, yields `None` + + Properties will be read out of the object using `read_property` and + `read_object_id` + """ + pass + + @abstractmethod + @contextmanager + def read_property(self, key): + """ + Read property from object + + A context manager that yields a `Decoder` that can be used to read the + value of the property with the given key in the current object, or `None` + if the property does not exist in the current object. + """ + pass + + @abstractmethod + def object_keys(self): + """ + Read property keys from an object + + Iterates over all the serialized keys for the current object + """ + pass + + @abstractmethod + def read_object_id(self, alias=None): + """ + Read current object ID property + + Returns the ID of the current object if one is defined, or `None` if + the current object has no ID. + + The ID must be a fully qualified IRI or a blank node + + If `alias` is provided, it is a hint as to another name by which the ID + might be found, if the `Decoder` supports aliases for an ID + """ + pass + + +class JSONLDDecoder(Decoder): + def __init__(self, data, root=False): + self.data = data + self.root = root + + def read_value(self): + if isinstance(self.data, str): + try: + return float(self.data) + except ValueError: + pass + return self.data + + def read_string(self): + if isinstance(self.data, str): + return self.data + return None + + def read_datetime(self): + return self.read_string() + + def read_integer(self): + if isinstance(self.data, int): + return self.data + return None + + def read_bool(self): + if isinstance(self.data, bool): + return self.data + return None + + def read_float(self): + if isinstance(self.data, (int, float, str)): + return float(self.data) + return None + + def read_iri(self): + if isinstance(self.data, str): + return self.data + return None + + def read_enum(self, e): + if isinstance(self.data, str): + return self.data + return None + + def read_list(self): + if isinstance(self.data, (list, tuple, set)): + for v in self.data: + yield self.__class__(v) + else: + yield self + + def __get_value(self, *keys): + for k in keys: + if k and k in self.data: + return self.data[k] + return None + + @contextmanager + def read_property(self, key): + v = self.__get_value(key) + if v is not None: + yield self.__class__(v) + else: + yield None + + def object_keys(self): + for key in self.data.keys(): + if key in ("@type", "type"): + continue + if self.root and key == "@context": + continue + yield key + + def read_object(self): + typ = self.__get_value("@type", "type") + if typ is not None: + return typ, self + + return None, self + + def read_object_id(self, alias=None): + return self.__get_value(alias, "@id") + + +class JSONLDDeserializer(object): + def deserialize_data(self, data, objectset: SHACLObjectSet): + if "@graph" in data: + h = JSONLDDecoder(data["@graph"], True) + else: + h = JSONLDDecoder(data, True) + + objectset.decode(h) + + def read(self, f, objectset: SHACLObjectSet): + data = json.load(f) + self.deserialize_data(data, objectset) + + +class Encoder(ABC): + @abstractmethod + def write_string(self, v): + """ + Write a string value + + Encodes the value as a string in the output + """ + pass + + @abstractmethod + def 
write_datetime(self, v): + """ + Write a date & time string + + Encodes the value as an ISO datetime string + + Note: The provided string is already correctly encoded as an ISO datetime + """ + pass + + @abstractmethod + def write_integer(self, v): + """ + Write an integer value + + Encodes the value as an integer in the output + """ + pass + + @abstractmethod + def write_iri(self, v, compact=None): + """ + Write IRI + + Encodes the string as an IRI. Note that the string will be either a + fully qualified IRI or a blank node ID. If `compact` is provided and + the serialization supports compacted IRIs, it should be preferred to + the full IRI + """ + pass + + @abstractmethod + def write_enum(self, v, e, compact=None): + """ + Write enum value IRI + + Encodes the string enum value IRI. Note that the string will be a fully + qualified IRI. If `compact` is provided and the serialization supports + compacted IRIs, it should be preferred to the full IRI. + """ + pass + + @abstractmethod + def write_bool(self, v): + """ + Write boolean + + Encodes the value as a boolean in the output + """ + pass + + @abstractmethod + def write_float(self, v): + """ + Write float + + Encodes the value as a floating point number in the output + """ + pass + + @abstractmethod + @contextmanager + def write_object(self, o, _id, needs_id): + """ + Write object + + A context manager that yields an `Encoder` that can be used to encode + the given object properties. + + The provided ID will always be a valid ID (even if o._id is `None`), in + case the `Encoder` _must_ have an ID. `needs_id` is a hint to indicate + to the `Encoder` if an ID must be written or not (if that is even an + option). If it is `True`, the `Encoder` must encode an ID for the + object. If `False`, the encoder is not required to encode an ID and may + omit it. + + The ID will be either a fully qualified IRI, or a blank node IRI. + + Properties will be written the object using `write_property` + """ + pass + + @abstractmethod + @contextmanager + def write_property(self, iri, compact=None): + """ + Write object property + + A context manager that yields an `Encoder` that can be used to encode + the value for the property with the given IRI in the current object + + Note that the IRI will be fully qualified. If `compact` is provided and + the serialization supports compacted IRIs, it should be preferred to + the full IRI. + """ + pass + + @abstractmethod + @contextmanager + def write_list(self): + """ + Write list + + A context manager that yields an `Encoder` that can be used to encode a + list. 
+ + Each item of the list will be added using `write_list_item` + """ + pass + + @abstractmethod + @contextmanager + def write_list_item(self): + """ + Write list item + + A context manager that yields an `Encoder` that can be used to encode + the value for a list item + """ + pass + + +class JSONLDEncoder(Encoder): + def __init__(self, data=None): + self.data = data + + def write_string(self, v): + self.data = v + + def write_datetime(self, v): + self.data = v + + def write_integer(self, v): + self.data = v + + def write_iri(self, v, compact=None): + self.write_string(compact or v) + + def write_enum(self, v, e, compact=None): + self.write_string(compact or v) + + def write_bool(self, v): + self.data = v + + def write_float(self, v): + self.data = str(v) + + @contextmanager + def write_property(self, iri, compact=None): + s = self.__class__(None) + yield s + if s.data is not None: + self.data[compact or iri] = s.data + + @contextmanager + def write_object(self, o, _id, needs_id): + self.data = { + "type": o.COMPACT_TYPE or o.TYPE, + } + if needs_id: + self.data[o.ID_ALIAS or "@id"] = _id + yield self + + @contextmanager + def write_list(self): + self.data = [] + yield self + if not self.data: + self.data = None + + @contextmanager + def write_list_item(self): + s = self.__class__(None) + yield s + if s.data is not None: + self.data.append(s.data) + + +class JSONLDSerializer(object): + def __init__(self, **args): + self.args = args + + def serialize_data( + self, + objectset: SHACLObjectSet, + force_at_graph=False, + ): + h = JSONLDEncoder() + objectset.encode(h, force_at_graph) + data = {} + if len(CONTEXT_URLS) == 1: + data["@context"] = CONTEXT_URLS[0] + elif CONTEXT_URLS: + data["@context"] = CONTEXT_URLS + + if isinstance(h.data, list): + data["@graph"] = h.data + else: + for k, v in h.data.items(): + data[k] = v + + return data + + def write( + self, + objectset: SHACLObjectSet, + f, + force_at_graph=False, + **kwargs, + ): + """ + Write a SHACLObjectSet to a JSON LD file + + If force_at_graph is True, a @graph node will always be written + """ + data = self.serialize_data(objectset, force_at_graph) + + args = {**self.args, **kwargs} + + sha1 = hashlib.sha1() + for chunk in json.JSONEncoder(**args).iterencode(data): + chunk = chunk.encode("utf-8") + f.write(chunk) + sha1.update(chunk) + + return sha1.hexdigest() + + +class JSONLDInlineEncoder(Encoder): + def __init__(self, f, sha1): + self.f = f + self.comma = False + self.sha1 = sha1 + + def write(self, s): + s = s.encode("utf-8") + self.f.write(s) + self.sha1.update(s) + + def _write_comma(self): + if self.comma: + self.write(",") + self.comma = False + + def write_string(self, v): + self.write(json.dumps(v)) + + def write_datetime(self, v): + self.write_string(v) + + def write_integer(self, v): + self.write(f"{v}") + + def write_iri(self, v, compact=None): + self.write_string(compact or v) + + def write_enum(self, v, e, compact=None): + self.write_iri(v, compact) + + def write_bool(self, v): + if v: + self.write("true") + else: + self.write("false") + + def write_float(self, v): + self.write(json.dumps(str(v))) + + @contextmanager + def write_property(self, iri, compact=None): + self._write_comma() + self.write_string(compact or iri) + self.write(":") + yield self + self.comma = True + + @contextmanager + def write_object(self, o, _id, needs_id): + self._write_comma() + + self.write("{") + self.write_string("type") + self.write(":") + self.write_string(o.COMPACT_TYPE or o.TYPE) + self.comma = True + + if needs_id: + 
self._write_comma() + self.write_string(o.ID_ALIAS or "@id") + self.write(":") + self.write_string(_id) + self.comma = True + + self.comma = True + yield self + + self.write("}") + self.comma = True + + @contextmanager + def write_list(self): + self._write_comma() + self.write("[") + yield self.__class__(self.f, self.sha1) + self.write("]") + self.comma = True + + @contextmanager + def write_list_item(self): + self._write_comma() + yield self.__class__(self.f, self.sha1) + self.comma = True + + +class JSONLDInlineSerializer(object): + def write( + self, + objectset: SHACLObjectSet, + f, + force_at_graph=False, + ): + """ + Write a SHACLObjectSet to a JSON LD file + + Note: force_at_graph is included for compatibility, but ignored. This + serializer always writes out a graph + """ + sha1 = hashlib.sha1() + h = JSONLDInlineEncoder(f, sha1) + h.write('{"@context":') + if len(CONTEXT_URLS) == 1: + h.write(f'"{CONTEXT_URLS[0]}"') + elif CONTEXT_URLS: + h.write('["') + h.write('","'.join(CONTEXT_URLS)) + h.write('"]') + h.write(",") + + h.write('"@graph":') + + objectset.encode(h, True) + h.write("}") + return sha1.hexdigest() + + +def print_tree(objects, all_fields=False): + """ + Print object tree + """ + seen = set() + + def callback(value, path): + nonlocal seen + + s = (" " * (len(path) - 1)) + f"{path[-1]}" + if isinstance(value, SHACLObject): + s += f" {value} ({id(value)})" + is_empty = False + elif isinstance(value, ListProxy): + is_empty = len(value) == 0 + if is_empty: + s += " []" + else: + s += f" {value!r}" + is_empty = value is None + + if all_fields or not is_empty: + print(s) + + if isinstance(value, SHACLObject): + if value in seen: + return False + seen.add(value) + return True + + return True + + for o in objects: + o.walk(callback) + + +# fmt: off +"""Format Guard""" + + +CONTEXT_URLS = [ + "https://spdx.org/rdf/3.0.0/spdx-context.jsonld", +] + +_NI_ENCODE_CONTEXT = { + "https://spdx.org/rdf/3.0.0/terms/AI/EnergyUnitType/kilowattHour": "ai_EnergyUnitType:kilowattHour", + "https://spdx.org/rdf/3.0.0/terms/AI/EnergyUnitType/megajoule": "ai_EnergyUnitType:megajoule", + "https://spdx.org/rdf/3.0.0/terms/AI/EnergyUnitType/other": "ai_EnergyUnitType:other", + "https://spdx.org/rdf/3.0.0/terms/AI/SafetyRiskAssessmentType/high": "ai_SafetyRiskAssessmentType:high", + "https://spdx.org/rdf/3.0.0/terms/AI/SafetyRiskAssessmentType/low": "ai_SafetyRiskAssessmentType:low", + "https://spdx.org/rdf/3.0.0/terms/AI/SafetyRiskAssessmentType/medium": "ai_SafetyRiskAssessmentType:medium", + "https://spdx.org/rdf/3.0.0/terms/AI/SafetyRiskAssessmentType/serious": "ai_SafetyRiskAssessmentType:serious", + "https://spdx.org/rdf/3.0.0/terms/Core/AnnotationType/other": "AnnotationType:other", + "https://spdx.org/rdf/3.0.0/terms/Core/AnnotationType/review": "AnnotationType:review", + "https://spdx.org/rdf/3.0.0/terms/Core/NoAssertionElement": "spdx:Core/NoAssertionElement", + "https://spdx.org/rdf/3.0.0/terms/Core/NoneElement": "spdx:Core/NoneElement", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/cpe22": "ExternalIdentifierType:cpe22", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/cpe23": "ExternalIdentifierType:cpe23", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/cve": "ExternalIdentifierType:cve", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/email": "ExternalIdentifierType:email", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/gitoid": "ExternalIdentifierType:gitoid", + 
"https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/other": "ExternalIdentifierType:other", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/packageUrl": "ExternalIdentifierType:packageUrl", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/securityOther": "ExternalIdentifierType:securityOther", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/swhid": "ExternalIdentifierType:swhid", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/swid": "ExternalIdentifierType:swid", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/urlScheme": "ExternalIdentifierType:urlScheme", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/altDownloadLocation": "ExternalRefType:altDownloadLocation", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/altWebPage": "ExternalRefType:altWebPage", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/binaryArtifact": "ExternalRefType:binaryArtifact", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/bower": "ExternalRefType:bower", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/buildMeta": "ExternalRefType:buildMeta", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/buildSystem": "ExternalRefType:buildSystem", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/certificationReport": "ExternalRefType:certificationReport", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/chat": "ExternalRefType:chat", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/componentAnalysisReport": "ExternalRefType:componentAnalysisReport", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/cwe": "ExternalRefType:cwe", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/documentation": "ExternalRefType:documentation", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/dynamicAnalysisReport": "ExternalRefType:dynamicAnalysisReport", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/eolNotice": "ExternalRefType:eolNotice", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/exportControlAssessment": "ExternalRefType:exportControlAssessment", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/funding": "ExternalRefType:funding", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/issueTracker": "ExternalRefType:issueTracker", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/license": "ExternalRefType:license", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/mailingList": "ExternalRefType:mailingList", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/mavenCentral": "ExternalRefType:mavenCentral", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/metrics": "ExternalRefType:metrics", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/npm": "ExternalRefType:npm", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/nuget": "ExternalRefType:nuget", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/other": "ExternalRefType:other", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/privacyAssessment": "ExternalRefType:privacyAssessment", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/productMetadata": "ExternalRefType:productMetadata", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/purchaseOrder": "ExternalRefType:purchaseOrder", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/qualityAssessmentReport": "ExternalRefType:qualityAssessmentReport", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/releaseHistory": 
"ExternalRefType:releaseHistory", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/releaseNotes": "ExternalRefType:releaseNotes", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/riskAssessment": "ExternalRefType:riskAssessment", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/runtimeAnalysisReport": "ExternalRefType:runtimeAnalysisReport", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/secureSoftwareAttestation": "ExternalRefType:secureSoftwareAttestation", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/securityAdversaryModel": "ExternalRefType:securityAdversaryModel", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/securityAdvisory": "ExternalRefType:securityAdvisory", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/securityFix": "ExternalRefType:securityFix", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/securityOther": "ExternalRefType:securityOther", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/securityPenTestReport": "ExternalRefType:securityPenTestReport", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/securityPolicy": "ExternalRefType:securityPolicy", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/securityThreatModel": "ExternalRefType:securityThreatModel", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/socialMedia": "ExternalRefType:socialMedia", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/sourceArtifact": "ExternalRefType:sourceArtifact", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/staticAnalysisReport": "ExternalRefType:staticAnalysisReport", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/support": "ExternalRefType:support", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/vcs": "ExternalRefType:vcs", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/vulnerabilityDisclosureReport": "ExternalRefType:vulnerabilityDisclosureReport", + "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/vulnerabilityExploitabilityAssessment": "ExternalRefType:vulnerabilityExploitabilityAssessment", + "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/blake2b256": "HashAlgorithm:blake2b256", + "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/blake2b384": "HashAlgorithm:blake2b384", + "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/blake2b512": "HashAlgorithm:blake2b512", + "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/blake3": "HashAlgorithm:blake3", + "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/crystalsDilithium": "HashAlgorithm:crystalsDilithium", + "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/crystalsKyber": "HashAlgorithm:crystalsKyber", + "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/falcon": "HashAlgorithm:falcon", + "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/md2": "HashAlgorithm:md2", + "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/md4": "HashAlgorithm:md4", + "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/md5": "HashAlgorithm:md5", + "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/md6": "HashAlgorithm:md6", + "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/other": "HashAlgorithm:other", + "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha1": "HashAlgorithm:sha1", + "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha224": "HashAlgorithm:sha224", + "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha256": "HashAlgorithm:sha256", + "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha384": "HashAlgorithm:sha384", + 
"https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha3_224": "HashAlgorithm:sha3_224", + "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha3_256": "HashAlgorithm:sha3_256", + "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha3_384": "HashAlgorithm:sha3_384", + "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha3_512": "HashAlgorithm:sha3_512", + "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha512": "HashAlgorithm:sha512", + "https://spdx.org/rdf/3.0.0/terms/Core/LifecycleScopeType/build": "LifecycleScopeType:build", + "https://spdx.org/rdf/3.0.0/terms/Core/LifecycleScopeType/design": "LifecycleScopeType:design", + "https://spdx.org/rdf/3.0.0/terms/Core/LifecycleScopeType/development": "LifecycleScopeType:development", + "https://spdx.org/rdf/3.0.0/terms/Core/LifecycleScopeType/other": "LifecycleScopeType:other", + "https://spdx.org/rdf/3.0.0/terms/Core/LifecycleScopeType/runtime": "LifecycleScopeType:runtime", + "https://spdx.org/rdf/3.0.0/terms/Core/LifecycleScopeType/test": "LifecycleScopeType:test", + "https://spdx.org/rdf/3.0.0/terms/Core/PresenceType/no": "PresenceType:no", + "https://spdx.org/rdf/3.0.0/terms/Core/PresenceType/noAssertion": "PresenceType:noAssertion", + "https://spdx.org/rdf/3.0.0/terms/Core/PresenceType/yes": "PresenceType:yes", + "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/ai": "ProfileIdentifierType:ai", + "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/build": "ProfileIdentifierType:build", + "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/core": "ProfileIdentifierType:core", + "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/dataset": "ProfileIdentifierType:dataset", + "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/expandedLicensing": "ProfileIdentifierType:expandedLicensing", + "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/extension": "ProfileIdentifierType:extension", + "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/lite": "ProfileIdentifierType:lite", + "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/security": "ProfileIdentifierType:security", + "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/simpleLicensing": "ProfileIdentifierType:simpleLicensing", + "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/software": "ProfileIdentifierType:software", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipCompleteness/complete": "RelationshipCompleteness:complete", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipCompleteness/incomplete": "RelationshipCompleteness:incomplete", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipCompleteness/noAssertion": "RelationshipCompleteness:noAssertion", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/affects": "RelationshipType:affects", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/amendedBy": "RelationshipType:amendedBy", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/ancestorOf": "RelationshipType:ancestorOf", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/availableFrom": "RelationshipType:availableFrom", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/configures": "RelationshipType:configures", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/contains": "RelationshipType:contains", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/coordinatedBy": "RelationshipType:coordinatedBy", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/copiedTo": "RelationshipType:copiedTo", + 
"https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/delegatedTo": "RelationshipType:delegatedTo", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/dependsOn": "RelationshipType:dependsOn", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/descendantOf": "RelationshipType:descendantOf", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/describes": "RelationshipType:describes", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/doesNotAffect": "RelationshipType:doesNotAffect", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/expandsTo": "RelationshipType:expandsTo", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/exploitCreatedBy": "RelationshipType:exploitCreatedBy", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/fixedBy": "RelationshipType:fixedBy", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/fixedIn": "RelationshipType:fixedIn", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/foundBy": "RelationshipType:foundBy", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/generates": "RelationshipType:generates", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasAddedFile": "RelationshipType:hasAddedFile", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasAssessmentFor": "RelationshipType:hasAssessmentFor", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasAssociatedVulnerability": "RelationshipType:hasAssociatedVulnerability", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasConcludedLicense": "RelationshipType:hasConcludedLicense", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasDataFile": "RelationshipType:hasDataFile", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasDeclaredLicense": "RelationshipType:hasDeclaredLicense", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasDeletedFile": "RelationshipType:hasDeletedFile", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasDependencyManifest": "RelationshipType:hasDependencyManifest", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasDistributionArtifact": "RelationshipType:hasDistributionArtifact", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasDocumentation": "RelationshipType:hasDocumentation", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasDynamicLink": "RelationshipType:hasDynamicLink", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasEvidence": "RelationshipType:hasEvidence", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasExample": "RelationshipType:hasExample", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasHost": "RelationshipType:hasHost", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasInputs": "RelationshipType:hasInputs", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasMetadata": "RelationshipType:hasMetadata", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasOptionalComponent": "RelationshipType:hasOptionalComponent", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasOptionalDependency": "RelationshipType:hasOptionalDependency", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasOutputs": "RelationshipType:hasOutputs", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasPrerequsite": "RelationshipType:hasPrerequsite", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasProvidedDependency": "RelationshipType:hasProvidedDependency", + 
"https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasRequirement": "RelationshipType:hasRequirement", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasSpecification": "RelationshipType:hasSpecification", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasStaticLink": "RelationshipType:hasStaticLink", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasTest": "RelationshipType:hasTest", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasTestCase": "RelationshipType:hasTestCase", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasVariant": "RelationshipType:hasVariant", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/invokedBy": "RelationshipType:invokedBy", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/modifiedBy": "RelationshipType:modifiedBy", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/other": "RelationshipType:other", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/packagedBy": "RelationshipType:packagedBy", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/patchedBy": "RelationshipType:patchedBy", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/publishedBy": "RelationshipType:publishedBy", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/reportedBy": "RelationshipType:reportedBy", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/republishedBy": "RelationshipType:republishedBy", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/serializedInArtifact": "RelationshipType:serializedInArtifact", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/testedOn": "RelationshipType:testedOn", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/trainedOn": "RelationshipType:trainedOn", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/underInvestigationFor": "RelationshipType:underInvestigationFor", + "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/usesTool": "RelationshipType:usesTool", + "https://spdx.org/rdf/3.0.0/terms/Core/SupportType/deployed": "SupportType:deployed", + "https://spdx.org/rdf/3.0.0/terms/Core/SupportType/development": "SupportType:development", + "https://spdx.org/rdf/3.0.0/terms/Core/SupportType/endOfSupport": "SupportType:endOfSupport", + "https://spdx.org/rdf/3.0.0/terms/Core/SupportType/limitedSupport": "SupportType:limitedSupport", + "https://spdx.org/rdf/3.0.0/terms/Core/SupportType/noAssertion": "SupportType:noAssertion", + "https://spdx.org/rdf/3.0.0/terms/Core/SupportType/noSupport": "SupportType:noSupport", + "https://spdx.org/rdf/3.0.0/terms/Core/SupportType/support": "SupportType:support", + "https://spdx.org/rdf/3.0.0/terms/Dataset/ConfidentialityLevelType/amber": "dataset_ConfidentialityLevelType:amber", + "https://spdx.org/rdf/3.0.0/terms/Dataset/ConfidentialityLevelType/clear": "dataset_ConfidentialityLevelType:clear", + "https://spdx.org/rdf/3.0.0/terms/Dataset/ConfidentialityLevelType/green": "dataset_ConfidentialityLevelType:green", + "https://spdx.org/rdf/3.0.0/terms/Dataset/ConfidentialityLevelType/red": "dataset_ConfidentialityLevelType:red", + "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetAvailabilityType/clickthrough": "dataset_DatasetAvailabilityType:clickthrough", + "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetAvailabilityType/directDownload": "dataset_DatasetAvailabilityType:directDownload", + "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetAvailabilityType/query": "dataset_DatasetAvailabilityType:query", + 
"https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetAvailabilityType/registration": "dataset_DatasetAvailabilityType:registration", + "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetAvailabilityType/scrapingScript": "dataset_DatasetAvailabilityType:scrapingScript", + "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/audio": "dataset_DatasetType:audio", + "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/categorical": "dataset_DatasetType:categorical", + "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/graph": "dataset_DatasetType:graph", + "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/image": "dataset_DatasetType:image", + "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/noAssertion": "dataset_DatasetType:noAssertion", + "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/numeric": "dataset_DatasetType:numeric", + "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/other": "dataset_DatasetType:other", + "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/sensor": "dataset_DatasetType:sensor", + "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/structured": "dataset_DatasetType:structured", + "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/syntactic": "dataset_DatasetType:syntactic", + "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/text": "dataset_DatasetType:text", + "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/timeseries": "dataset_DatasetType:timeseries", + "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/timestamp": "dataset_DatasetType:timestamp", + "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/video": "dataset_DatasetType:video", + "https://spdx.org/rdf/3.0.0/terms/Security/CvssSeverityType/critical": "security_CvssSeverityType:critical", + "https://spdx.org/rdf/3.0.0/terms/Security/CvssSeverityType/high": "security_CvssSeverityType:high", + "https://spdx.org/rdf/3.0.0/terms/Security/CvssSeverityType/low": "security_CvssSeverityType:low", + "https://spdx.org/rdf/3.0.0/terms/Security/CvssSeverityType/medium": "security_CvssSeverityType:medium", + "https://spdx.org/rdf/3.0.0/terms/Security/CvssSeverityType/none": "security_CvssSeverityType:none", + "https://spdx.org/rdf/3.0.0/terms/Security/ExploitCatalogType/kev": "security_ExploitCatalogType:kev", + "https://spdx.org/rdf/3.0.0/terms/Security/ExploitCatalogType/other": "security_ExploitCatalogType:other", + "https://spdx.org/rdf/3.0.0/terms/Security/SsvcDecisionType/act": "security_SsvcDecisionType:act", + "https://spdx.org/rdf/3.0.0/terms/Security/SsvcDecisionType/attend": "security_SsvcDecisionType:attend", + "https://spdx.org/rdf/3.0.0/terms/Security/SsvcDecisionType/track": "security_SsvcDecisionType:track", + "https://spdx.org/rdf/3.0.0/terms/Security/SsvcDecisionType/trackStar": "security_SsvcDecisionType:trackStar", + "https://spdx.org/rdf/3.0.0/terms/Security/VexJustificationType/componentNotPresent": "security_VexJustificationType:componentNotPresent", + "https://spdx.org/rdf/3.0.0/terms/Security/VexJustificationType/inlineMitigationsAlreadyExist": "security_VexJustificationType:inlineMitigationsAlreadyExist", + "https://spdx.org/rdf/3.0.0/terms/Security/VexJustificationType/vulnerableCodeCannotBeControlledByAdversary": "security_VexJustificationType:vulnerableCodeCannotBeControlledByAdversary", + "https://spdx.org/rdf/3.0.0/terms/Security/VexJustificationType/vulnerableCodeNotInExecutePath": "security_VexJustificationType:vulnerableCodeNotInExecutePath", + "https://spdx.org/rdf/3.0.0/terms/Security/VexJustificationType/vulnerableCodeNotPresent": 
"security_VexJustificationType:vulnerableCodeNotPresent", + "https://spdx.org/rdf/3.0.0/terms/Software/ContentIdentifierType/gitoid": "software_ContentIdentifierType:gitoid", + "https://spdx.org/rdf/3.0.0/terms/Software/ContentIdentifierType/swhid": "software_ContentIdentifierType:swhid", + "https://spdx.org/rdf/3.0.0/terms/Software/FileKindType/directory": "software_FileKindType:directory", + "https://spdx.org/rdf/3.0.0/terms/Software/FileKindType/file": "software_FileKindType:file", + "https://spdx.org/rdf/3.0.0/terms/Software/SbomType/analyzed": "software_SbomType:analyzed", + "https://spdx.org/rdf/3.0.0/terms/Software/SbomType/build": "software_SbomType:build", + "https://spdx.org/rdf/3.0.0/terms/Software/SbomType/deployed": "software_SbomType:deployed", + "https://spdx.org/rdf/3.0.0/terms/Software/SbomType/design": "software_SbomType:design", + "https://spdx.org/rdf/3.0.0/terms/Software/SbomType/runtime": "software_SbomType:runtime", + "https://spdx.org/rdf/3.0.0/terms/Software/SbomType/source": "software_SbomType:source", + "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/application": "software_SoftwarePurpose:application", + "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/archive": "software_SoftwarePurpose:archive", + "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/bom": "software_SoftwarePurpose:bom", + "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/configuration": "software_SoftwarePurpose:configuration", + "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/container": "software_SoftwarePurpose:container", + "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/data": "software_SoftwarePurpose:data", + "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/device": "software_SoftwarePurpose:device", + "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/deviceDriver": "software_SoftwarePurpose:deviceDriver", + "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/diskImage": "software_SoftwarePurpose:diskImage", + "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/documentation": "software_SoftwarePurpose:documentation", + "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/evidence": "software_SoftwarePurpose:evidence", + "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/executable": "software_SoftwarePurpose:executable", + "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/file": "software_SoftwarePurpose:file", + "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/filesystemImage": "software_SoftwarePurpose:filesystemImage", + "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/firmware": "software_SoftwarePurpose:firmware", + "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/framework": "software_SoftwarePurpose:framework", + "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/install": "software_SoftwarePurpose:install", + "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/library": "software_SoftwarePurpose:library", + "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/manifest": "software_SoftwarePurpose:manifest", + "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/model": "software_SoftwarePurpose:model", + "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/module": "software_SoftwarePurpose:module", + "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/operatingSystem": "software_SoftwarePurpose:operatingSystem", + "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/other": 
"software_SoftwarePurpose:other", + "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/patch": "software_SoftwarePurpose:patch", + "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/platform": "software_SoftwarePurpose:platform", + "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/requirement": "software_SoftwarePurpose:requirement", + "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/source": "software_SoftwarePurpose:source", + "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/specification": "software_SoftwarePurpose:specification", + "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/test": "software_SoftwarePurpose:test", + "https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/NoAssertionLicense": "spdx:ExpandedLicensing/NoAssertionLicense", + "https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/NoneLicense": "spdx:ExpandedLicensing/NoneLicense", +} + +_NI_DECODE_CONTEXT = { + "ai_EnergyUnitType:kilowattHour": "https://spdx.org/rdf/3.0.0/terms/AI/EnergyUnitType/kilowattHour", + "ai_EnergyUnitType:megajoule": "https://spdx.org/rdf/3.0.0/terms/AI/EnergyUnitType/megajoule", + "ai_EnergyUnitType:other": "https://spdx.org/rdf/3.0.0/terms/AI/EnergyUnitType/other", + "ai_SafetyRiskAssessmentType:high": "https://spdx.org/rdf/3.0.0/terms/AI/SafetyRiskAssessmentType/high", + "ai_SafetyRiskAssessmentType:low": "https://spdx.org/rdf/3.0.0/terms/AI/SafetyRiskAssessmentType/low", + "ai_SafetyRiskAssessmentType:medium": "https://spdx.org/rdf/3.0.0/terms/AI/SafetyRiskAssessmentType/medium", + "ai_SafetyRiskAssessmentType:serious": "https://spdx.org/rdf/3.0.0/terms/AI/SafetyRiskAssessmentType/serious", + "AnnotationType:other": "https://spdx.org/rdf/3.0.0/terms/Core/AnnotationType/other", + "AnnotationType:review": "https://spdx.org/rdf/3.0.0/terms/Core/AnnotationType/review", + "spdx:Core/NoAssertionElement": "https://spdx.org/rdf/3.0.0/terms/Core/NoAssertionElement", + "spdx:Core/NoneElement": "https://spdx.org/rdf/3.0.0/terms/Core/NoneElement", + "ExternalIdentifierType:cpe22": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/cpe22", + "ExternalIdentifierType:cpe23": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/cpe23", + "ExternalIdentifierType:cve": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/cve", + "ExternalIdentifierType:email": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/email", + "ExternalIdentifierType:gitoid": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/gitoid", + "ExternalIdentifierType:other": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/other", + "ExternalIdentifierType:packageUrl": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/packageUrl", + "ExternalIdentifierType:securityOther": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/securityOther", + "ExternalIdentifierType:swhid": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/swhid", + "ExternalIdentifierType:swid": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/swid", + "ExternalIdentifierType:urlScheme": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/urlScheme", + "ExternalRefType:altDownloadLocation": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/altDownloadLocation", + "ExternalRefType:altWebPage": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/altWebPage", + "ExternalRefType:binaryArtifact": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/binaryArtifact", + "ExternalRefType:bower": 
"https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/bower", + "ExternalRefType:buildMeta": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/buildMeta", + "ExternalRefType:buildSystem": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/buildSystem", + "ExternalRefType:certificationReport": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/certificationReport", + "ExternalRefType:chat": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/chat", + "ExternalRefType:componentAnalysisReport": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/componentAnalysisReport", + "ExternalRefType:cwe": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/cwe", + "ExternalRefType:documentation": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/documentation", + "ExternalRefType:dynamicAnalysisReport": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/dynamicAnalysisReport", + "ExternalRefType:eolNotice": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/eolNotice", + "ExternalRefType:exportControlAssessment": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/exportControlAssessment", + "ExternalRefType:funding": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/funding", + "ExternalRefType:issueTracker": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/issueTracker", + "ExternalRefType:license": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/license", + "ExternalRefType:mailingList": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/mailingList", + "ExternalRefType:mavenCentral": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/mavenCentral", + "ExternalRefType:metrics": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/metrics", + "ExternalRefType:npm": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/npm", + "ExternalRefType:nuget": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/nuget", + "ExternalRefType:other": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/other", + "ExternalRefType:privacyAssessment": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/privacyAssessment", + "ExternalRefType:productMetadata": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/productMetadata", + "ExternalRefType:purchaseOrder": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/purchaseOrder", + "ExternalRefType:qualityAssessmentReport": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/qualityAssessmentReport", + "ExternalRefType:releaseHistory": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/releaseHistory", + "ExternalRefType:releaseNotes": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/releaseNotes", + "ExternalRefType:riskAssessment": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/riskAssessment", + "ExternalRefType:runtimeAnalysisReport": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/runtimeAnalysisReport", + "ExternalRefType:secureSoftwareAttestation": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/secureSoftwareAttestation", + "ExternalRefType:securityAdversaryModel": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/securityAdversaryModel", + "ExternalRefType:securityAdvisory": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/securityAdvisory", + "ExternalRefType:securityFix": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/securityFix", + "ExternalRefType:securityOther": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/securityOther", + "ExternalRefType:securityPenTestReport": 
"https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/securityPenTestReport", + "ExternalRefType:securityPolicy": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/securityPolicy", + "ExternalRefType:securityThreatModel": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/securityThreatModel", + "ExternalRefType:socialMedia": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/socialMedia", + "ExternalRefType:sourceArtifact": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/sourceArtifact", + "ExternalRefType:staticAnalysisReport": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/staticAnalysisReport", + "ExternalRefType:support": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/support", + "ExternalRefType:vcs": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/vcs", + "ExternalRefType:vulnerabilityDisclosureReport": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/vulnerabilityDisclosureReport", + "ExternalRefType:vulnerabilityExploitabilityAssessment": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/vulnerabilityExploitabilityAssessment", + "HashAlgorithm:blake2b256": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/blake2b256", + "HashAlgorithm:blake2b384": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/blake2b384", + "HashAlgorithm:blake2b512": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/blake2b512", + "HashAlgorithm:blake3": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/blake3", + "HashAlgorithm:crystalsDilithium": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/crystalsDilithium", + "HashAlgorithm:crystalsKyber": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/crystalsKyber", + "HashAlgorithm:falcon": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/falcon", + "HashAlgorithm:md2": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/md2", + "HashAlgorithm:md4": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/md4", + "HashAlgorithm:md5": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/md5", + "HashAlgorithm:md6": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/md6", + "HashAlgorithm:other": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/other", + "HashAlgorithm:sha1": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha1", + "HashAlgorithm:sha224": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha224", + "HashAlgorithm:sha256": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha256", + "HashAlgorithm:sha384": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha384", + "HashAlgorithm:sha3_224": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha3_224", + "HashAlgorithm:sha3_256": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha3_256", + "HashAlgorithm:sha3_384": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha3_384", + "HashAlgorithm:sha3_512": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha3_512", + "HashAlgorithm:sha512": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha512", + "LifecycleScopeType:build": "https://spdx.org/rdf/3.0.0/terms/Core/LifecycleScopeType/build", + "LifecycleScopeType:design": "https://spdx.org/rdf/3.0.0/terms/Core/LifecycleScopeType/design", + "LifecycleScopeType:development": "https://spdx.org/rdf/3.0.0/terms/Core/LifecycleScopeType/development", + "LifecycleScopeType:other": "https://spdx.org/rdf/3.0.0/terms/Core/LifecycleScopeType/other", + "LifecycleScopeType:runtime": "https://spdx.org/rdf/3.0.0/terms/Core/LifecycleScopeType/runtime", + "LifecycleScopeType:test": 
"https://spdx.org/rdf/3.0.0/terms/Core/LifecycleScopeType/test", + "PresenceType:no": "https://spdx.org/rdf/3.0.0/terms/Core/PresenceType/no", + "PresenceType:noAssertion": "https://spdx.org/rdf/3.0.0/terms/Core/PresenceType/noAssertion", + "PresenceType:yes": "https://spdx.org/rdf/3.0.0/terms/Core/PresenceType/yes", + "ProfileIdentifierType:ai": "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/ai", + "ProfileIdentifierType:build": "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/build", + "ProfileIdentifierType:core": "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/core", + "ProfileIdentifierType:dataset": "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/dataset", + "ProfileIdentifierType:expandedLicensing": "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/expandedLicensing", + "ProfileIdentifierType:extension": "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/extension", + "ProfileIdentifierType:lite": "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/lite", + "ProfileIdentifierType:security": "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/security", + "ProfileIdentifierType:simpleLicensing": "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/simpleLicensing", + "ProfileIdentifierType:software": "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/software", + "RelationshipCompleteness:complete": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipCompleteness/complete", + "RelationshipCompleteness:incomplete": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipCompleteness/incomplete", + "RelationshipCompleteness:noAssertion": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipCompleteness/noAssertion", + "RelationshipType:affects": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/affects", + "RelationshipType:amendedBy": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/amendedBy", + "RelationshipType:ancestorOf": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/ancestorOf", + "RelationshipType:availableFrom": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/availableFrom", + "RelationshipType:configures": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/configures", + "RelationshipType:contains": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/contains", + "RelationshipType:coordinatedBy": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/coordinatedBy", + "RelationshipType:copiedTo": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/copiedTo", + "RelationshipType:delegatedTo": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/delegatedTo", + "RelationshipType:dependsOn": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/dependsOn", + "RelationshipType:descendantOf": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/descendantOf", + "RelationshipType:describes": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/describes", + "RelationshipType:doesNotAffect": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/doesNotAffect", + "RelationshipType:expandsTo": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/expandsTo", + "RelationshipType:exploitCreatedBy": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/exploitCreatedBy", + "RelationshipType:fixedBy": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/fixedBy", + "RelationshipType:fixedIn": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/fixedIn", + "RelationshipType:foundBy": 
"https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/foundBy", + "RelationshipType:generates": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/generates", + "RelationshipType:hasAddedFile": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasAddedFile", + "RelationshipType:hasAssessmentFor": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasAssessmentFor", + "RelationshipType:hasAssociatedVulnerability": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasAssociatedVulnerability", + "RelationshipType:hasConcludedLicense": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasConcludedLicense", + "RelationshipType:hasDataFile": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasDataFile", + "RelationshipType:hasDeclaredLicense": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasDeclaredLicense", + "RelationshipType:hasDeletedFile": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasDeletedFile", + "RelationshipType:hasDependencyManifest": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasDependencyManifest", + "RelationshipType:hasDistributionArtifact": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasDistributionArtifact", + "RelationshipType:hasDocumentation": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasDocumentation", + "RelationshipType:hasDynamicLink": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasDynamicLink", + "RelationshipType:hasEvidence": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasEvidence", + "RelationshipType:hasExample": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasExample", + "RelationshipType:hasHost": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasHost", + "RelationshipType:hasInputs": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasInputs", + "RelationshipType:hasMetadata": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasMetadata", + "RelationshipType:hasOptionalComponent": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasOptionalComponent", + "RelationshipType:hasOptionalDependency": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasOptionalDependency", + "RelationshipType:hasOutputs": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasOutputs", + "RelationshipType:hasPrerequsite": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasPrerequsite", + "RelationshipType:hasProvidedDependency": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasProvidedDependency", + "RelationshipType:hasRequirement": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasRequirement", + "RelationshipType:hasSpecification": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasSpecification", + "RelationshipType:hasStaticLink": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasStaticLink", + "RelationshipType:hasTest": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasTest", + "RelationshipType:hasTestCase": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasTestCase", + "RelationshipType:hasVariant": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasVariant", + "RelationshipType:invokedBy": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/invokedBy", + "RelationshipType:modifiedBy": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/modifiedBy", + "RelationshipType:other": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/other", + "RelationshipType:packagedBy": 
"https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/packagedBy", + "RelationshipType:patchedBy": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/patchedBy", + "RelationshipType:publishedBy": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/publishedBy", + "RelationshipType:reportedBy": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/reportedBy", + "RelationshipType:republishedBy": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/republishedBy", + "RelationshipType:serializedInArtifact": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/serializedInArtifact", + "RelationshipType:testedOn": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/testedOn", + "RelationshipType:trainedOn": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/trainedOn", + "RelationshipType:underInvestigationFor": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/underInvestigationFor", + "RelationshipType:usesTool": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/usesTool", + "SupportType:deployed": "https://spdx.org/rdf/3.0.0/terms/Core/SupportType/deployed", + "SupportType:development": "https://spdx.org/rdf/3.0.0/terms/Core/SupportType/development", + "SupportType:endOfSupport": "https://spdx.org/rdf/3.0.0/terms/Core/SupportType/endOfSupport", + "SupportType:limitedSupport": "https://spdx.org/rdf/3.0.0/terms/Core/SupportType/limitedSupport", + "SupportType:noAssertion": "https://spdx.org/rdf/3.0.0/terms/Core/SupportType/noAssertion", + "SupportType:noSupport": "https://spdx.org/rdf/3.0.0/terms/Core/SupportType/noSupport", + "SupportType:support": "https://spdx.org/rdf/3.0.0/terms/Core/SupportType/support", + "dataset_ConfidentialityLevelType:amber": "https://spdx.org/rdf/3.0.0/terms/Dataset/ConfidentialityLevelType/amber", + "dataset_ConfidentialityLevelType:clear": "https://spdx.org/rdf/3.0.0/terms/Dataset/ConfidentialityLevelType/clear", + "dataset_ConfidentialityLevelType:green": "https://spdx.org/rdf/3.0.0/terms/Dataset/ConfidentialityLevelType/green", + "dataset_ConfidentialityLevelType:red": "https://spdx.org/rdf/3.0.0/terms/Dataset/ConfidentialityLevelType/red", + "dataset_DatasetAvailabilityType:clickthrough": "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetAvailabilityType/clickthrough", + "dataset_DatasetAvailabilityType:directDownload": "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetAvailabilityType/directDownload", + "dataset_DatasetAvailabilityType:query": "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetAvailabilityType/query", + "dataset_DatasetAvailabilityType:registration": "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetAvailabilityType/registration", + "dataset_DatasetAvailabilityType:scrapingScript": "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetAvailabilityType/scrapingScript", + "dataset_DatasetType:audio": "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/audio", + "dataset_DatasetType:categorical": "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/categorical", + "dataset_DatasetType:graph": "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/graph", + "dataset_DatasetType:image": "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/image", + "dataset_DatasetType:noAssertion": "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/noAssertion", + "dataset_DatasetType:numeric": "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/numeric", + "dataset_DatasetType:other": "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/other", + "dataset_DatasetType:sensor": 
"https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/sensor", + "dataset_DatasetType:structured": "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/structured", + "dataset_DatasetType:syntactic": "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/syntactic", + "dataset_DatasetType:text": "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/text", + "dataset_DatasetType:timeseries": "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/timeseries", + "dataset_DatasetType:timestamp": "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/timestamp", + "dataset_DatasetType:video": "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/video", + "security_CvssSeverityType:critical": "https://spdx.org/rdf/3.0.0/terms/Security/CvssSeverityType/critical", + "security_CvssSeverityType:high": "https://spdx.org/rdf/3.0.0/terms/Security/CvssSeverityType/high", + "security_CvssSeverityType:low": "https://spdx.org/rdf/3.0.0/terms/Security/CvssSeverityType/low", + "security_CvssSeverityType:medium": "https://spdx.org/rdf/3.0.0/terms/Security/CvssSeverityType/medium", + "security_CvssSeverityType:none": "https://spdx.org/rdf/3.0.0/terms/Security/CvssSeverityType/none", + "security_ExploitCatalogType:kev": "https://spdx.org/rdf/3.0.0/terms/Security/ExploitCatalogType/kev", + "security_ExploitCatalogType:other": "https://spdx.org/rdf/3.0.0/terms/Security/ExploitCatalogType/other", + "security_SsvcDecisionType:act": "https://spdx.org/rdf/3.0.0/terms/Security/SsvcDecisionType/act", + "security_SsvcDecisionType:attend": "https://spdx.org/rdf/3.0.0/terms/Security/SsvcDecisionType/attend", + "security_SsvcDecisionType:track": "https://spdx.org/rdf/3.0.0/terms/Security/SsvcDecisionType/track", + "security_SsvcDecisionType:trackStar": "https://spdx.org/rdf/3.0.0/terms/Security/SsvcDecisionType/trackStar", + "security_VexJustificationType:componentNotPresent": "https://spdx.org/rdf/3.0.0/terms/Security/VexJustificationType/componentNotPresent", + "security_VexJustificationType:inlineMitigationsAlreadyExist": "https://spdx.org/rdf/3.0.0/terms/Security/VexJustificationType/inlineMitigationsAlreadyExist", + "security_VexJustificationType:vulnerableCodeCannotBeControlledByAdversary": "https://spdx.org/rdf/3.0.0/terms/Security/VexJustificationType/vulnerableCodeCannotBeControlledByAdversary", + "security_VexJustificationType:vulnerableCodeNotInExecutePath": "https://spdx.org/rdf/3.0.0/terms/Security/VexJustificationType/vulnerableCodeNotInExecutePath", + "security_VexJustificationType:vulnerableCodeNotPresent": "https://spdx.org/rdf/3.0.0/terms/Security/VexJustificationType/vulnerableCodeNotPresent", + "software_ContentIdentifierType:gitoid": "https://spdx.org/rdf/3.0.0/terms/Software/ContentIdentifierType/gitoid", + "software_ContentIdentifierType:swhid": "https://spdx.org/rdf/3.0.0/terms/Software/ContentIdentifierType/swhid", + "software_FileKindType:directory": "https://spdx.org/rdf/3.0.0/terms/Software/FileKindType/directory", + "software_FileKindType:file": "https://spdx.org/rdf/3.0.0/terms/Software/FileKindType/file", + "software_SbomType:analyzed": "https://spdx.org/rdf/3.0.0/terms/Software/SbomType/analyzed", + "software_SbomType:build": "https://spdx.org/rdf/3.0.0/terms/Software/SbomType/build", + "software_SbomType:deployed": "https://spdx.org/rdf/3.0.0/terms/Software/SbomType/deployed", + "software_SbomType:design": "https://spdx.org/rdf/3.0.0/terms/Software/SbomType/design", + "software_SbomType:runtime": "https://spdx.org/rdf/3.0.0/terms/Software/SbomType/runtime", + "software_SbomType:source": 
"https://spdx.org/rdf/3.0.0/terms/Software/SbomType/source", + "software_SoftwarePurpose:application": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/application", + "software_SoftwarePurpose:archive": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/archive", + "software_SoftwarePurpose:bom": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/bom", + "software_SoftwarePurpose:configuration": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/configuration", + "software_SoftwarePurpose:container": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/container", + "software_SoftwarePurpose:data": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/data", + "software_SoftwarePurpose:device": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/device", + "software_SoftwarePurpose:deviceDriver": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/deviceDriver", + "software_SoftwarePurpose:diskImage": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/diskImage", + "software_SoftwarePurpose:documentation": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/documentation", + "software_SoftwarePurpose:evidence": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/evidence", + "software_SoftwarePurpose:executable": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/executable", + "software_SoftwarePurpose:file": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/file", + "software_SoftwarePurpose:filesystemImage": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/filesystemImage", + "software_SoftwarePurpose:firmware": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/firmware", + "software_SoftwarePurpose:framework": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/framework", + "software_SoftwarePurpose:install": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/install", + "software_SoftwarePurpose:library": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/library", + "software_SoftwarePurpose:manifest": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/manifest", + "software_SoftwarePurpose:model": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/model", + "software_SoftwarePurpose:module": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/module", + "software_SoftwarePurpose:operatingSystem": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/operatingSystem", + "software_SoftwarePurpose:other": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/other", + "software_SoftwarePurpose:patch": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/patch", + "software_SoftwarePurpose:platform": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/platform", + "software_SoftwarePurpose:requirement": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/requirement", + "software_SoftwarePurpose:source": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/source", + "software_SoftwarePurpose:specification": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/specification", + "software_SoftwarePurpose:test": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/test", + "spdx:ExpandedLicensing/NoAssertionLicense": "https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/NoAssertionLicense", + "spdx:ExpandedLicensing/NoneLicense": "https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/NoneLicense", +} + + +# CLASSES +# The class that contains properties to describe energy consumption 
incurred +# by an AI model in different stages of its lifecycle. +@register("https://spdx.org/rdf/3.0.0/terms/AI/EnergyConsumption", compact_type="ai_EnergyConsumption", abstract=False) +class ai_EnergyConsumption(SHACLObject): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Specifies the amount of energy consumed when finetuning the AI model that is + # being used in the AI system. + cls._add_property( + "ai_finetuningEnergyConsumption", + ListProp(ObjectProp(ai_EnergyConsumptionDescription, False)), + iri="https://spdx.org/rdf/3.0.0/terms/AI/finetuningEnergyConsumption", + compact="ai_finetuningEnergyConsumption", + ) + # Specifies the amount of energy consumed during inference time by an AI model + # that is being used in the AI system. + cls._add_property( + "ai_inferenceEnergyConsumption", + ListProp(ObjectProp(ai_EnergyConsumptionDescription, False)), + iri="https://spdx.org/rdf/3.0.0/terms/AI/inferenceEnergyConsumption", + compact="ai_inferenceEnergyConsumption", + ) + # Specifies the amount of energy consumed when training the AI model that is + # being used in the AI system. + cls._add_property( + "ai_trainingEnergyConsumption", + ListProp(ObjectProp(ai_EnergyConsumptionDescription, False)), + iri="https://spdx.org/rdf/3.0.0/terms/AI/trainingEnergyConsumption", + compact="ai_trainingEnergyConsumption", + ) + + +# The class that helps note down the quantity of energy consumption and the unit +# used for measurement. +@register("https://spdx.org/rdf/3.0.0/terms/AI/EnergyConsumptionDescription", compact_type="ai_EnergyConsumptionDescription", abstract=False) +class ai_EnergyConsumptionDescription(SHACLObject): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Represents the energy quantity. + cls._add_property( + "ai_energyQuantity", + FloatProp(), + iri="https://spdx.org/rdf/3.0.0/terms/AI/energyQuantity", + min_count=1, + compact="ai_energyQuantity", + ) + # Specifies the unit in which energy is measured. + cls._add_property( + "ai_energyUnit", + EnumProp([ + ("https://spdx.org/rdf/3.0.0/terms/AI/EnergyUnitType/kilowattHour", "kilowattHour"), + ("https://spdx.org/rdf/3.0.0/terms/AI/EnergyUnitType/megajoule", "megajoule"), + ("https://spdx.org/rdf/3.0.0/terms/AI/EnergyUnitType/other", "other"), + ]), + iri="https://spdx.org/rdf/3.0.0/terms/AI/energyUnit", + min_count=1, + compact="ai_energyUnit", + ) + + +# Specifies the unit of energy consumption. +@register("https://spdx.org/rdf/3.0.0/terms/AI/EnergyUnitType", compact_type="ai_EnergyUnitType", abstract=False) +class ai_EnergyUnitType(SHACLObject): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + "kilowattHour": "https://spdx.org/rdf/3.0.0/terms/AI/EnergyUnitType/kilowattHour", + "megajoule": "https://spdx.org/rdf/3.0.0/terms/AI/EnergyUnitType/megajoule", + "other": "https://spdx.org/rdf/3.0.0/terms/AI/EnergyUnitType/other", + } + # Kilowatt-hour. + kilowattHour = "https://spdx.org/rdf/3.0.0/terms/AI/EnergyUnitType/kilowattHour" + # Megajoule. + megajoule = "https://spdx.org/rdf/3.0.0/terms/AI/EnergyUnitType/megajoule" + # Any other units of energy measurement. + other = "https://spdx.org/rdf/3.0.0/terms/AI/EnergyUnitType/other" + + +# Specifies the safety risk level. 
+@register("https://spdx.org/rdf/3.0.0/terms/AI/SafetyRiskAssessmentType", compact_type="ai_SafetyRiskAssessmentType", abstract=False) +class ai_SafetyRiskAssessmentType(SHACLObject): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + "high": "https://spdx.org/rdf/3.0.0/terms/AI/SafetyRiskAssessmentType/high", + "low": "https://spdx.org/rdf/3.0.0/terms/AI/SafetyRiskAssessmentType/low", + "medium": "https://spdx.org/rdf/3.0.0/terms/AI/SafetyRiskAssessmentType/medium", + "serious": "https://spdx.org/rdf/3.0.0/terms/AI/SafetyRiskAssessmentType/serious", + } + # The second-highest level of risk posed by an AI system. + high = "https://spdx.org/rdf/3.0.0/terms/AI/SafetyRiskAssessmentType/high" + # Low/no risk is posed by an AI system. + low = "https://spdx.org/rdf/3.0.0/terms/AI/SafetyRiskAssessmentType/low" + # The third-highest level of risk posed by an AI system. + medium = "https://spdx.org/rdf/3.0.0/terms/AI/SafetyRiskAssessmentType/medium" + # The highest level of risk posed by an AI system. + serious = "https://spdx.org/rdf/3.0.0/terms/AI/SafetyRiskAssessmentType/serious" + + +# Specifies the type of an annotation. +@register("https://spdx.org/rdf/3.0.0/terms/Core/AnnotationType", compact_type="AnnotationType", abstract=False) +class AnnotationType(SHACLObject): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + "other": "https://spdx.org/rdf/3.0.0/terms/Core/AnnotationType/other", + "review": "https://spdx.org/rdf/3.0.0/terms/Core/AnnotationType/review", + } + # Used to store extra information about an Element which is not part of a Review (e.g. extra information provided during the creation of the Element). + other = "https://spdx.org/rdf/3.0.0/terms/Core/AnnotationType/other" + # Used when someone reviews the Element. + review = "https://spdx.org/rdf/3.0.0/terms/Core/AnnotationType/review" + + +# Provides information about the creation of the Element. +@register("https://spdx.org/rdf/3.0.0/terms/Core/CreationInfo", compact_type="CreationInfo", abstract=False) +class CreationInfo(SHACLObject): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Provide consumers with comments by the creator of the Element about the + # Element. + cls._add_property( + "comment", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Core/comment", + compact="comment", + ) + # Identifies when the Element was originally created. + cls._add_property( + "created", + DateTimeStampProp(pattern=r"^\d\d\d\d-\d\d-\d\dT\d\d:\d\d:\d\dZ$",), + iri="https://spdx.org/rdf/3.0.0/terms/Core/created", + min_count=1, + compact="created", + ) + # Identifies who or what created the Element. + cls._add_property( + "createdBy", + ListProp(ObjectProp(Agent, False)), + iri="https://spdx.org/rdf/3.0.0/terms/Core/createdBy", + min_count=1, + compact="createdBy", + ) + # Identifies the tooling that was used during the creation of the Element. + cls._add_property( + "createdUsing", + ListProp(ObjectProp(Tool, False)), + iri="https://spdx.org/rdf/3.0.0/terms/Core/createdUsing", + compact="createdUsing", + ) + # Provides a reference number that can be used to understand how to parse and interpret an Element. 
+ cls._add_property( + "specVersion", + StringProp(pattern=r"^(0|[1-9]\d*)\.(0|[1-9]\d*)\.(0|[1-9]\d*)(?:-((?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*)(?:\.(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*))*))?(?:\+([0-9a-zA-Z-]+(?:\.[0-9a-zA-Z-]+)*))?$",), + iri="https://spdx.org/rdf/3.0.0/terms/Core/specVersion", + min_count=1, + compact="specVersion", + ) + + +# A key with an associated value. +@register("https://spdx.org/rdf/3.0.0/terms/Core/DictionaryEntry", compact_type="DictionaryEntry", abstract=False) +class DictionaryEntry(SHACLObject): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # A key used in a generic key-value pair. + cls._add_property( + "key", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Core/key", + min_count=1, + compact="key", + ) + # A value used in a generic key-value pair. + cls._add_property( + "value", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Core/value", + compact="value", + ) + + +# Base domain class from which all other SPDX-3.0 domain classes derive. +@register("https://spdx.org/rdf/3.0.0/terms/Core/Element", compact_type="Element", abstract=True) +class Element(SHACLObject): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + "NoAssertionElement": "https://spdx.org/rdf/3.0.0/terms/Core/NoAssertionElement", + "NoneElement": "https://spdx.org/rdf/3.0.0/terms/Core/NoneElement", + } + # An Individual Value for Element representing a set of Elements of unknown + # identify or cardinality (number). + NoAssertionElement = "https://spdx.org/rdf/3.0.0/terms/Core/NoAssertionElement" + # An Individual Value for Element representing a set of Elements with + # cardinality (number/count) of zero. + NoneElement = "https://spdx.org/rdf/3.0.0/terms/Core/NoneElement" + + @classmethod + def _register_props(cls): + super()._register_props() + # Provide consumers with comments by the creator of the Element about the + # Element. + cls._add_property( + "comment", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Core/comment", + compact="comment", + ) + # Provides information about the creation of the Element. + cls._add_property( + "creationInfo", + ObjectProp(CreationInfo, True), + iri="https://spdx.org/rdf/3.0.0/terms/Core/creationInfo", + min_count=1, + compact="creationInfo", + ) + # Provides a detailed description of the Element. + cls._add_property( + "description", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Core/description", + compact="description", + ) + # Specifies an Extension characterization of some aspect of an Element. + cls._add_property( + "extension", + ListProp(ObjectProp(extension_Extension, False)), + iri="https://spdx.org/rdf/3.0.0/terms/Core/extension", + compact="extension", + ) + # Provides a reference to a resource outside the scope of SPDX-3.0 content + # that uniquely identifies an Element. + cls._add_property( + "externalIdentifier", + ListProp(ObjectProp(ExternalIdentifier, False)), + iri="https://spdx.org/rdf/3.0.0/terms/Core/externalIdentifier", + compact="externalIdentifier", + ) + # Points to a resource outside the scope of the SPDX-3.0 content + # that provides additional characteristics of an Element. + cls._add_property( + "externalRef", + ListProp(ObjectProp(ExternalRef, False)), + iri="https://spdx.org/rdf/3.0.0/terms/Core/externalRef", + compact="externalRef", + ) + # Identifies the name of an Element as designated by the creator. 
+ cls._add_property( + "name", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Core/name", + compact="name", + ) + # A short description of an Element. + cls._add_property( + "summary", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Core/summary", + compact="summary", + ) + # Provides an IntegrityMethod with which the integrity of an Element can be + # asserted. + cls._add_property( + "verifiedUsing", + ListProp(ObjectProp(IntegrityMethod, False)), + iri="https://spdx.org/rdf/3.0.0/terms/Core/verifiedUsing", + compact="verifiedUsing", + ) + + +# A collection of Elements, not necessarily with unifying context. +@register("https://spdx.org/rdf/3.0.0/terms/Core/ElementCollection", compact_type="ElementCollection", abstract=True) +class ElementCollection(Element): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Refers to one or more Elements that are part of an ElementCollection. + cls._add_property( + "element", + ListProp(ObjectProp(Element, False)), + iri="https://spdx.org/rdf/3.0.0/terms/Core/element", + compact="element", + ) + # Describes one a profile which the creator of this ElementCollection intends to + # conform to. + cls._add_property( + "profileConformance", + ListProp(EnumProp([ + ("https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/ai", "ai"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/build", "build"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/core", "core"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/dataset", "dataset"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/expandedLicensing", "expandedLicensing"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/extension", "extension"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/lite", "lite"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/security", "security"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/simpleLicensing", "simpleLicensing"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/software", "software"), + ])), + iri="https://spdx.org/rdf/3.0.0/terms/Core/profileConformance", + compact="profileConformance", + ) + # This property is used to denote the root Element(s) of a tree of elements contained in a BOM. + cls._add_property( + "rootElement", + ListProp(ObjectProp(Element, False)), + iri="https://spdx.org/rdf/3.0.0/terms/Core/rootElement", + compact="rootElement", + ) + + +# A reference to a resource identifier defined outside the scope of SPDX-3.0 content that uniquely identifies an Element. +@register("https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifier", compact_type="ExternalIdentifier", abstract=False) +class ExternalIdentifier(SHACLObject): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Provide consumers with comments by the creator of the Element about the + # Element. + cls._add_property( + "comment", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Core/comment", + compact="comment", + ) + # Specifies the type of the external identifier. 
+ cls._add_property( + "externalIdentifierType", + EnumProp([ + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/cpe22", "cpe22"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/cpe23", "cpe23"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/cve", "cve"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/email", "email"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/gitoid", "gitoid"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/other", "other"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/packageUrl", "packageUrl"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/securityOther", "securityOther"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/swhid", "swhid"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/swid", "swid"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/urlScheme", "urlScheme"), + ]), + iri="https://spdx.org/rdf/3.0.0/terms/Core/externalIdentifierType", + min_count=1, + compact="externalIdentifierType", + ) + # Uniquely identifies an external element. + cls._add_property( + "identifier", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Core/identifier", + min_count=1, + compact="identifier", + ) + # Provides the location for more information regarding an external identifier. + cls._add_property( + "identifierLocator", + ListProp(AnyURIProp()), + iri="https://spdx.org/rdf/3.0.0/terms/Core/identifierLocator", + compact="identifierLocator", + ) + # An entity that is authorized to issue identification credentials. + cls._add_property( + "issuingAuthority", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Core/issuingAuthority", + compact="issuingAuthority", + ) + + +# Specifies the type of an external identifier. +@register("https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType", compact_type="ExternalIdentifierType", abstract=False) +class ExternalIdentifierType(SHACLObject): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + "cpe22": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/cpe22", + "cpe23": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/cpe23", + "cve": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/cve", + "email": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/email", + "gitoid": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/gitoid", + "other": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/other", + "packageUrl": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/packageUrl", + "securityOther": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/securityOther", + "swhid": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/swhid", + "swid": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/swid", + "urlScheme": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/urlScheme", + } + # https://cpe.mitre.org/files/cpe-specification_2.2.pdf + cpe22 = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/cpe22" + # https://nvlpubs.nist.gov/nistpubs/Legacy/IR/nistir7695.pdf + cpe23 = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/cpe23" + # An identifier for a specific software flaw defined within the official CVE Dictionary and that conforms to the CVE specification as defined by https://csrc.nist.gov/glossary/term/cve_id. 
+ cve = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/cve" + # https://datatracker.ietf.org/doc/html/rfc3696#section-3 + email = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/email" + # https://www.iana.org/assignments/uri-schemes/prov/gitoid Gitoid stands for [Git Object ID](https://git-scm.com/book/en/v2/Git-Internals-Git-Objects) and a gitoid of type blob is a unique hash of a binary artifact. A gitoid may represent the software [Artifact ID](https://github.com/omnibor/spec/blob/main/spec/SPEC.md#artifact-id) or the [OmniBOR Identifier](https://github.com/omnibor/spec/blob/main/spec/SPEC.md#omnibor-identifier) for the software artifact's associated [OmniBOR Document](https://github.com/omnibor/spec/blob/main/spec/SPEC.md#omnibor-document); this ambiguity exists because the OmniBOR Document is itself an artifact, and the gitoid of that artifact is its valid identifier. Omnibor is a minimalistic schema to describe software [Artifact Dependency Graphs](https://github.com/omnibor/spec/blob/main/spec/SPEC.md#artifact-dependency-graph-adg). Gitoids calculated on software artifacts (Snippet, File, or Package Elements) should be recorded in the SPDX 3.0 SoftwareArtifact's ContentIdentifier property. Gitoids calculated on the OmniBOR Document (OmniBOR Identifiers) should be recorded in the SPDX 3.0 Element's ExternalIdentifier property. + gitoid = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/gitoid" + # Used when the type doesn't match any of the other options. + other = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/other" + # https://github.com/package-url/purl-spec + packageUrl = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/packageUrl" + # Used when there is a security related identifier of unspecified type. + securityOther = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/securityOther" + # SoftWare Hash IDentifier, persistent intrinsic identifiers for digital artifacts, such as files, trees (also known as directories or folders), commits, and other objects typically found in version control systems. The syntax of the identifiers is defined in the [SWHID specification](https://www.swhid.org/specification/v1.1/4.Syntax) and they typically look like `swh:1:cnt:94a9ed024d3859793618152ea559a168bbcbb5e2`. + swhid = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/swhid" + # https://www.ietf.org/archive/id/draft-ietf-sacm-coswid-21.html#section-2.3 + swid = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/swid" + # the scheme used in order to locate a resource https://www.iana.org/assignments/uri-schemes/uri-schemes.xhtml + urlScheme = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalIdentifierType/urlScheme" + + +# A map of Element identifiers that are used within a Document but defined external to that Document. +@register("https://spdx.org/rdf/3.0.0/terms/Core/ExternalMap", compact_type="ExternalMap", abstract=False) +class ExternalMap(SHACLObject): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Artifact representing a serialization instance of SPDX data containing the + # definition of a particular Element. + cls._add_property( + "definingArtifact", + ObjectProp(Artifact, False), + iri="https://spdx.org/rdf/3.0.0/terms/Core/definingArtifact", + compact="definingArtifact", + ) + # Identifies an external Element used within a Document but defined external to + # that Document. 
+ cls._add_property( + "externalSpdxId", + AnyURIProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Core/externalSpdxId", + min_count=1, + compact="externalSpdxId", + ) + # Provides an indication of where to retrieve an external Element. + cls._add_property( + "locationHint", + AnyURIProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Core/locationHint", + compact="locationHint", + ) + # Provides an IntegrityMethod with which the integrity of an Element can be + # asserted. + cls._add_property( + "verifiedUsing", + ListProp(ObjectProp(IntegrityMethod, False)), + iri="https://spdx.org/rdf/3.0.0/terms/Core/verifiedUsing", + compact="verifiedUsing", + ) + + +# A reference to a resource outside the scope of SPDX-3.0 content related to an Element. +@register("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRef", compact_type="ExternalRef", abstract=False) +class ExternalRef(SHACLObject): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Provide consumers with comments by the creator of the Element about the + # Element. + cls._add_property( + "comment", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Core/comment", + compact="comment", + ) + # Specifies the media type of an Element or Property. + cls._add_property( + "contentType", + StringProp(pattern=r"^[^\/]+\/[^\/]+$",), + iri="https://spdx.org/rdf/3.0.0/terms/Core/contentType", + compact="contentType", + ) + # Specifies the type of the external reference. + cls._add_property( + "externalRefType", + EnumProp([ + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/altDownloadLocation", "altDownloadLocation"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/altWebPage", "altWebPage"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/binaryArtifact", "binaryArtifact"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/bower", "bower"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/buildMeta", "buildMeta"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/buildSystem", "buildSystem"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/certificationReport", "certificationReport"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/chat", "chat"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/componentAnalysisReport", "componentAnalysisReport"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/cwe", "cwe"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/documentation", "documentation"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/dynamicAnalysisReport", "dynamicAnalysisReport"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/eolNotice", "eolNotice"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/exportControlAssessment", "exportControlAssessment"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/funding", "funding"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/issueTracker", "issueTracker"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/license", "license"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/mailingList", "mailingList"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/mavenCentral", "mavenCentral"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/metrics", "metrics"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/npm", "npm"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/nuget", "nuget"), + 
("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/other", "other"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/privacyAssessment", "privacyAssessment"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/productMetadata", "productMetadata"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/purchaseOrder", "purchaseOrder"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/qualityAssessmentReport", "qualityAssessmentReport"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/releaseHistory", "releaseHistory"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/releaseNotes", "releaseNotes"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/riskAssessment", "riskAssessment"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/runtimeAnalysisReport", "runtimeAnalysisReport"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/secureSoftwareAttestation", "secureSoftwareAttestation"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/securityAdversaryModel", "securityAdversaryModel"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/securityAdvisory", "securityAdvisory"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/securityFix", "securityFix"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/securityOther", "securityOther"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/securityPenTestReport", "securityPenTestReport"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/securityPolicy", "securityPolicy"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/securityThreatModel", "securityThreatModel"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/socialMedia", "socialMedia"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/sourceArtifact", "sourceArtifact"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/staticAnalysisReport", "staticAnalysisReport"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/support", "support"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/vcs", "vcs"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/vulnerabilityDisclosureReport", "vulnerabilityDisclosureReport"), + ("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/vulnerabilityExploitabilityAssessment", "vulnerabilityExploitabilityAssessment"), + ]), + iri="https://spdx.org/rdf/3.0.0/terms/Core/externalRefType", + compact="externalRefType", + ) + # Provides the location of an external reference. + cls._add_property( + "locator", + ListProp(StringProp()), + iri="https://spdx.org/rdf/3.0.0/terms/Core/locator", + compact="locator", + ) + + +# Specifies the type of an external reference. 
+@register("https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType", compact_type="ExternalRefType", abstract=False) +class ExternalRefType(SHACLObject): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + "altDownloadLocation": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/altDownloadLocation", + "altWebPage": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/altWebPage", + "binaryArtifact": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/binaryArtifact", + "bower": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/bower", + "buildMeta": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/buildMeta", + "buildSystem": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/buildSystem", + "certificationReport": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/certificationReport", + "chat": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/chat", + "componentAnalysisReport": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/componentAnalysisReport", + "cwe": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/cwe", + "documentation": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/documentation", + "dynamicAnalysisReport": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/dynamicAnalysisReport", + "eolNotice": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/eolNotice", + "exportControlAssessment": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/exportControlAssessment", + "funding": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/funding", + "issueTracker": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/issueTracker", + "license": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/license", + "mailingList": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/mailingList", + "mavenCentral": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/mavenCentral", + "metrics": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/metrics", + "npm": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/npm", + "nuget": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/nuget", + "other": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/other", + "privacyAssessment": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/privacyAssessment", + "productMetadata": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/productMetadata", + "purchaseOrder": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/purchaseOrder", + "qualityAssessmentReport": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/qualityAssessmentReport", + "releaseHistory": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/releaseHistory", + "releaseNotes": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/releaseNotes", + "riskAssessment": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/riskAssessment", + "runtimeAnalysisReport": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/runtimeAnalysisReport", + "secureSoftwareAttestation": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/secureSoftwareAttestation", + "securityAdversaryModel": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/securityAdversaryModel", + "securityAdvisory": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/securityAdvisory", + "securityFix": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/securityFix", + "securityOther": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/securityOther", + "securityPenTestReport": 
"https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/securityPenTestReport", + "securityPolicy": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/securityPolicy", + "securityThreatModel": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/securityThreatModel", + "socialMedia": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/socialMedia", + "sourceArtifact": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/sourceArtifact", + "staticAnalysisReport": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/staticAnalysisReport", + "support": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/support", + "vcs": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/vcs", + "vulnerabilityDisclosureReport": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/vulnerabilityDisclosureReport", + "vulnerabilityExploitabilityAssessment": "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/vulnerabilityExploitabilityAssessment", + } + # A reference to an alternative download location. + altDownloadLocation = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/altDownloadLocation" + # A reference to an alternative web page. + altWebPage = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/altWebPage" + # A reference to binary artifacts related to a package. + binaryArtifact = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/binaryArtifact" + # A reference to a bower package. + bower = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/bower" + # A reference build metadata related to a published package. + buildMeta = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/buildMeta" + # A reference build system used to create or publish the package. + buildSystem = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/buildSystem" + # A reference to a certification report for a package from an accredited/independent body. + certificationReport = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/certificationReport" + # A reference to the instant messaging system used by the maintainer for a package. + chat = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/chat" + # A reference to a Software Composition Analysis (SCA) report. + componentAnalysisReport = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/componentAnalysisReport" + # A reference to a source of software flaw defined within the official CWE Dictionary that conforms to the CWE specification as defined by https://csrc.nist.gov/glossary/term/common_weakness_enumeration. + cwe = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/cwe" + # A reference to the documentation for a package. + documentation = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/documentation" + # A reference to a dynamic analysis report for a package. + dynamicAnalysisReport = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/dynamicAnalysisReport" + # A reference to the End Of Sale (EOS) and/or End Of Life (EOL) information related to a package. + eolNotice = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/eolNotice" + # A reference to a export control assessment for a package. + exportControlAssessment = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/exportControlAssessment" + # A reference to funding information related to a package. + funding = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/funding" + # A reference to the issue tracker for a package. 
+ issueTracker = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/issueTracker" + # A reference to additional license information related to an artifact. + license = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/license" + # A reference to the mailing list used by the maintainer for a package. + mailingList = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/mailingList" + # A reference to a maven repository artifact. + mavenCentral = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/mavenCentral" + # A reference to metrics related to package such as OpenSSF scorecards. + metrics = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/metrics" + # A reference to an npm package. + npm = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/npm" + # A reference to a nuget package. + nuget = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/nuget" + # Used when the type doesn't match any of the other options. + other = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/other" + # A reference to a privacy assessment for a package. + privacyAssessment = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/privacyAssessment" + # A reference to additional product metadata such as reference within organization's product catalog. + productMetadata = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/productMetadata" + # A reference to a purchase order for a package. + purchaseOrder = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/purchaseOrder" + # A reference to a quality assessment for a package. + qualityAssessmentReport = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/qualityAssessmentReport" + # A reference to a published list of releases for a package. + releaseHistory = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/releaseHistory" + # A reference to the release notes for a package. + releaseNotes = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/releaseNotes" + # A reference to a risk assessment for a package. + riskAssessment = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/riskAssessment" + # A reference to a runtime analysis report for a package. + runtimeAnalysisReport = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/runtimeAnalysisReport" + # A reference to information assuring that the software is developed using security practices as defined by [NIST SP 800-218 Secure Software Development Framework (SSDF) Version 1.1](https://csrc.nist.gov/pubs/sp/800/218/final) or [CISA Secure Software Development Attestation Form](https://www.cisa.gov/resources-tools/resources/secure-software-development-attestation-form). + secureSoftwareAttestation = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/secureSoftwareAttestation" + # A reference to the security adversary model for a package. + securityAdversaryModel = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/securityAdversaryModel" + # A reference to a published security advisory (where advisory as defined per ISO 29147:2018) that may affect one or more elements, e.g., vendor advisories or specific NVD entries. + securityAdvisory = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/securityAdvisory" + # A reference to the patch or source code that fixes a vulnerability. + securityFix = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/securityFix" + # A reference to related security information of unspecified type. 
+ securityOther = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/securityOther" + # A reference to a [penetration test](https://en.wikipedia.org/wiki/Penetration_test) report for a package. + securityPenTestReport = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/securityPenTestReport" + # A reference to instructions for reporting newly discovered security vulnerabilities for a package. + securityPolicy = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/securityPolicy" + # A reference the [security threat model](https://en.wikipedia.org/wiki/Threat_model) for a package. + securityThreatModel = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/securityThreatModel" + # A reference to a social media channel for a package. + socialMedia = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/socialMedia" + # A reference to an artifact containing the sources for a package. + sourceArtifact = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/sourceArtifact" + # A reference to a static analysis report for a package. + staticAnalysisReport = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/staticAnalysisReport" + # A reference to the software support channel or other support information for a package. + support = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/support" + # A reference to a version control system related to a software artifact. + vcs = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/vcs" + # A reference to a Vulnerability Disclosure Report (VDR) which provides the software supplier's analysis and findings describing the impact (or lack of impact) that reported vulnerabilities have on packages or products in the supplier's SBOM as defined in [NIST SP 800-161](https://csrc.nist.gov/pubs/sp/800/161/r1/final). + vulnerabilityDisclosureReport = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/vulnerabilityDisclosureReport" + # A reference to a Vulnerability Exploitability eXchange (VEX) statement which provides information on whether a product is impacted by a specific vulnerability in an included package and, if affected, whether there are actions recommended to remediate. See also [NTIA VEX one-page summary](https://ntia.gov/files/ntia/publications/vex_one-page_summary.pdf). + vulnerabilityExploitabilityAssessment = "https://spdx.org/rdf/3.0.0/terms/Core/ExternalRefType/vulnerabilityExploitabilityAssessment" + + +# A mathematical algorithm that maps data of arbitrary size to a bit string. 
+@register("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm", compact_type="HashAlgorithm", abstract=False) +class HashAlgorithm(SHACLObject): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + "blake2b256": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/blake2b256", + "blake2b384": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/blake2b384", + "blake2b512": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/blake2b512", + "blake3": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/blake3", + "crystalsDilithium": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/crystalsDilithium", + "crystalsKyber": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/crystalsKyber", + "falcon": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/falcon", + "md2": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/md2", + "md4": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/md4", + "md5": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/md5", + "md6": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/md6", + "other": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/other", + "sha1": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha1", + "sha224": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha224", + "sha256": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha256", + "sha384": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha384", + "sha3_224": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha3_224", + "sha3_256": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha3_256", + "sha3_384": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha3_384", + "sha3_512": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha3_512", + "sha512": "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha512", + } + # blake2b algorithm with a digest size of 256 https://datatracker.ietf.org/doc/html/rfc7693#section-4 + blake2b256 = "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/blake2b256" + # blake2b algorithm with a digest size of 384 https://datatracker.ietf.org/doc/html/rfc7693#section-4 + blake2b384 = "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/blake2b384" + # blake2b algorithm with a digest size of 512 https://datatracker.ietf.org/doc/html/rfc7693#section-4 + blake2b512 = "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/blake2b512" + # https://github.com/BLAKE3-team/BLAKE3-specs/blob/master/blake3.pdf + blake3 = "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/blake3" + # https://pq-crystals.org/dilithium/index.shtml + crystalsDilithium = "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/crystalsDilithium" + # https://pq-crystals.org/kyber/index.shtml + crystalsKyber = "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/crystalsKyber" + # https://falcon-sign.info/falcon.pdf + falcon = "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/falcon" + # https://datatracker.ietf.org/doc/rfc1319/ + md2 = "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/md2" + # https://datatracker.ietf.org/doc/html/rfc1186 + md4 = "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/md4" + # https://datatracker.ietf.org/doc/html/rfc1321 + md5 = "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/md5" + # https://people.csail.mit.edu/rivest/pubs/RABCx08.pdf + md6 = "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/md6" + # any hashing algorithm that does not exist in this list of entries + other = "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/other" + # https://datatracker.ietf.org/doc/html/rfc3174 + 
sha1 = "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha1" + # secure hashing algorithm with a digest length of 224 https://datatracker.ietf.org/doc/html/draft-ietf-pkix-sha224-01 + sha224 = "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha224" + # secure hashing algorithm with a digest length of 256 https://www.rfc-editor.org/rfc/rfc4634 + sha256 = "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha256" + # secure hashing algorithm with a digest length of 384 https://www.rfc-editor.org/rfc/rfc4634 + sha384 = "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha384" + # sha3 with a digest length of 224 https://nvlpubs.nist.gov/nistpubs/FIPS/NIST.FIPS.202.pdf + sha3_224 = "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha3_224" + # sha3 with a digest length of 256 https://nvlpubs.nist.gov/nistpubs/FIPS/NIST.FIPS.202.pdf + sha3_256 = "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha3_256" + # sha3 with a digest length of 384 https://nvlpubs.nist.gov/nistpubs/FIPS/NIST.FIPS.202.pdf + sha3_384 = "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha3_384" + # sha3 with a digest length of 512 https://nvlpubs.nist.gov/nistpubs/FIPS/NIST.FIPS.202.pdf + sha3_512 = "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha3_512" + # secure hashing algorithm with a digest length of 512 https://www.rfc-editor.org/rfc/rfc4634 + sha512 = "https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha512" + + +# Provides an independently reproducible mechanism that permits verification of a specific Element. +@register("https://spdx.org/rdf/3.0.0/terms/Core/IntegrityMethod", compact_type="IntegrityMethod", abstract=True) +class IntegrityMethod(SHACLObject): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Provide consumers with comments by the creator of the Element about the + # Element. + cls._add_property( + "comment", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Core/comment", + compact="comment", + ) + + +# Provide an enumerated set of lifecycle phases that can provide context to relationships. +@register("https://spdx.org/rdf/3.0.0/terms/Core/LifecycleScopeType", compact_type="LifecycleScopeType", abstract=False) +class LifecycleScopeType(SHACLObject): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + "build": "https://spdx.org/rdf/3.0.0/terms/Core/LifecycleScopeType/build", + "design": "https://spdx.org/rdf/3.0.0/terms/Core/LifecycleScopeType/design", + "development": "https://spdx.org/rdf/3.0.0/terms/Core/LifecycleScopeType/development", + "other": "https://spdx.org/rdf/3.0.0/terms/Core/LifecycleScopeType/other", + "runtime": "https://spdx.org/rdf/3.0.0/terms/Core/LifecycleScopeType/runtime", + "test": "https://spdx.org/rdf/3.0.0/terms/Core/LifecycleScopeType/test", + } + # A relationship has specific context implications during an element's build phase, during development. + build = "https://spdx.org/rdf/3.0.0/terms/Core/LifecycleScopeType/build" + # A relationship has specific context implications during an element's design. + design = "https://spdx.org/rdf/3.0.0/terms/Core/LifecycleScopeType/design" + # A relationship has specific context implications during development phase of an element. + development = "https://spdx.org/rdf/3.0.0/terms/Core/LifecycleScopeType/development" + # A relationship has other specific context information necessary to capture that the above set of enumerations does not handle. 
+ other = "https://spdx.org/rdf/3.0.0/terms/Core/LifecycleScopeType/other" + # A relationship has specific context implications during the execution phase of an element. + runtime = "https://spdx.org/rdf/3.0.0/terms/Core/LifecycleScopeType/runtime" + # A relationship has specific context implications during an element's testing phase, during development. + test = "https://spdx.org/rdf/3.0.0/terms/Core/LifecycleScopeType/test" + + +# A mapping between prefixes and namespace partial URIs. +@register("https://spdx.org/rdf/3.0.0/terms/Core/NamespaceMap", compact_type="NamespaceMap", abstract=False) +class NamespaceMap(SHACLObject): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Provides an unambiguous mechanism for conveying a URI fragment portion of an + # ElementID. + cls._add_property( + "namespace", + AnyURIProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Core/namespace", + min_count=1, + compact="namespace", + ) + # A substitute for a URI. + cls._add_property( + "prefix", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Core/prefix", + min_count=1, + compact="prefix", + ) + + +# An SPDX version 2.X compatible verification method for software packages. +@register("https://spdx.org/rdf/3.0.0/terms/Core/PackageVerificationCode", compact_type="PackageVerificationCode", abstract=False) +class PackageVerificationCode(IntegrityMethod): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Specifies the algorithm used for calculating the hash value. + cls._add_property( + "algorithm", + EnumProp([ + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/blake2b256", "blake2b256"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/blake2b384", "blake2b384"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/blake2b512", "blake2b512"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/blake3", "blake3"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/crystalsDilithium", "crystalsDilithium"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/crystalsKyber", "crystalsKyber"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/falcon", "falcon"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/md2", "md2"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/md4", "md4"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/md5", "md5"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/md6", "md6"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/other", "other"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha1", "sha1"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha224", "sha224"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha256", "sha256"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha384", "sha384"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha3_224", "sha3_224"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha3_256", "sha3_256"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha3_384", "sha3_384"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha3_512", "sha3_512"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha512", "sha512"), + ]), + iri="https://spdx.org/rdf/3.0.0/terms/Core/algorithm", + min_count=1, + compact="algorithm", + ) + # The result of applying a hash algorithm to an Element. 
+ cls._add_property( + "hashValue", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Core/hashValue", + min_count=1, + compact="hashValue", + ) + # The relative file name of a file to be excluded from the + # `PackageVerificationCode`. + cls._add_property( + "packageVerificationCodeExcludedFile", + ListProp(StringProp()), + iri="https://spdx.org/rdf/3.0.0/terms/Core/packageVerificationCodeExcludedFile", + compact="packageVerificationCodeExcludedFile", + ) + + +# A tuple of two positive integers that define a range. +@register("https://spdx.org/rdf/3.0.0/terms/Core/PositiveIntegerRange", compact_type="PositiveIntegerRange", abstract=False) +class PositiveIntegerRange(SHACLObject): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Defines the beginning of a range. + cls._add_property( + "beginIntegerRange", + PositiveIntegerProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Core/beginIntegerRange", + min_count=1, + compact="beginIntegerRange", + ) + # Defines the end of a range. + cls._add_property( + "endIntegerRange", + PositiveIntegerProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Core/endIntegerRange", + min_count=1, + compact="endIntegerRange", + ) + + +# Categories of presence or absence. +@register("https://spdx.org/rdf/3.0.0/terms/Core/PresenceType", compact_type="PresenceType", abstract=False) +class PresenceType(SHACLObject): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + "no": "https://spdx.org/rdf/3.0.0/terms/Core/PresenceType/no", + "noAssertion": "https://spdx.org/rdf/3.0.0/terms/Core/PresenceType/noAssertion", + "yes": "https://spdx.org/rdf/3.0.0/terms/Core/PresenceType/yes", + } + # Indicates absence of the field. + no = "https://spdx.org/rdf/3.0.0/terms/Core/PresenceType/no" + # Makes no assertion about the field. + noAssertion = "https://spdx.org/rdf/3.0.0/terms/Core/PresenceType/noAssertion" + # Indicates presence of the field. + yes = "https://spdx.org/rdf/3.0.0/terms/Core/PresenceType/yes" + + +# Enumeration of the valid profiles. 
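[Editorial aside, not part of the patch: PackageVerificationCode carries a mandatory algorithm and hashValue plus an optional exclusion list. A minimal usage sketch, assuming the attribute-style property access the generated SHACLObject classes provide; the digest value is hypothetical:]

    pvc = PackageVerificationCode()
    # 'algorithm' is an EnumProp, so it takes one of the HashAlgorithm IRIs.
    pvc.algorithm = HashAlgorithm.sha1
    # Hypothetical digest; both algorithm and hashValue have min_count=1.
    pvc.hashValue = "da39a3ee5e6b4b0d3255bfef95601890afd80709"
    # Optional list of relative file names excluded from the code.
    pvc.packageVerificationCodeExcludedFile = ["./package.spdx"]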
+@register("https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType", compact_type="ProfileIdentifierType", abstract=False) +class ProfileIdentifierType(SHACLObject): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + "ai": "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/ai", + "build": "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/build", + "core": "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/core", + "dataset": "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/dataset", + "expandedLicensing": "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/expandedLicensing", + "extension": "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/extension", + "lite": "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/lite", + "security": "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/security", + "simpleLicensing": "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/simpleLicensing", + "software": "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/software", + } + # the element follows the AI profile specification + ai = "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/ai" + # the element follows the Build profile specification + build = "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/build" + # the element follows the Core profile specification + core = "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/core" + # the element follows the Dataset profile specification + dataset = "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/dataset" + # the element follows the expanded Licensing profile + expandedLicensing = "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/expandedLicensing" + # the element follows the Extension profile specification + extension = "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/extension" + # the element follows the Lite profile specification + lite = "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/lite" + # the element follows the Security profile specification + security = "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/security" + # the element follows the simple Licensing profile + simpleLicensing = "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/simpleLicensing" + # the element follows the Software profile specification + software = "https://spdx.org/rdf/3.0.0/terms/Core/ProfileIdentifierType/software" + + +# Describes a relationship between one or more elements. +@register("https://spdx.org/rdf/3.0.0/terms/Core/Relationship", compact_type="Relationship", abstract=False) +class Relationship(Element): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Provides information about the completeness of relationships. + cls._add_property( + "completeness", + EnumProp([ + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipCompleteness/complete", "complete"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipCompleteness/incomplete", "incomplete"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipCompleteness/noAssertion", "noAssertion"), + ]), + iri="https://spdx.org/rdf/3.0.0/terms/Core/completeness", + compact="completeness", + ) + # Specifies the time from which an element is no longer applicable / valid. 
+ cls._add_property( + "endTime", + DateTimeStampProp(pattern=r"^\d\d\d\d-\d\d-\d\dT\d\d:\d\d:\d\dZ$",), + iri="https://spdx.org/rdf/3.0.0/terms/Core/endTime", + compact="endTime", + ) + # References the Element on the left-hand side of a relationship. + cls._add_property( + "from_", + ObjectProp(Element, True), + iri="https://spdx.org/rdf/3.0.0/terms/Core/from", + min_count=1, + compact="from", + ) + # Information about the relationship between two Elements. + cls._add_property( + "relationshipType", + EnumProp([ + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/affects", "affects"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/amendedBy", "amendedBy"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/ancestorOf", "ancestorOf"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/availableFrom", "availableFrom"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/configures", "configures"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/contains", "contains"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/coordinatedBy", "coordinatedBy"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/copiedTo", "copiedTo"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/delegatedTo", "delegatedTo"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/dependsOn", "dependsOn"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/descendantOf", "descendantOf"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/describes", "describes"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/doesNotAffect", "doesNotAffect"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/expandsTo", "expandsTo"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/exploitCreatedBy", "exploitCreatedBy"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/fixedBy", "fixedBy"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/fixedIn", "fixedIn"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/foundBy", "foundBy"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/generates", "generates"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasAddedFile", "hasAddedFile"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasAssessmentFor", "hasAssessmentFor"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasAssociatedVulnerability", "hasAssociatedVulnerability"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasConcludedLicense", "hasConcludedLicense"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasDataFile", "hasDataFile"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasDeclaredLicense", "hasDeclaredLicense"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasDeletedFile", "hasDeletedFile"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasDependencyManifest", "hasDependencyManifest"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasDistributionArtifact", "hasDistributionArtifact"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasDocumentation", "hasDocumentation"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasDynamicLink", "hasDynamicLink"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasEvidence", "hasEvidence"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasExample", "hasExample"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasHost", "hasHost"), + 
("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasInputs", "hasInputs"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasMetadata", "hasMetadata"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasOptionalComponent", "hasOptionalComponent"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasOptionalDependency", "hasOptionalDependency"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasOutputs", "hasOutputs"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasPrerequsite", "hasPrerequsite"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasProvidedDependency", "hasProvidedDependency"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasRequirement", "hasRequirement"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasSpecification", "hasSpecification"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasStaticLink", "hasStaticLink"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasTest", "hasTest"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasTestCase", "hasTestCase"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasVariant", "hasVariant"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/invokedBy", "invokedBy"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/modifiedBy", "modifiedBy"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/other", "other"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/packagedBy", "packagedBy"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/patchedBy", "patchedBy"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/publishedBy", "publishedBy"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/reportedBy", "reportedBy"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/republishedBy", "republishedBy"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/serializedInArtifact", "serializedInArtifact"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/testedOn", "testedOn"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/trainedOn", "trainedOn"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/underInvestigationFor", "underInvestigationFor"), + ("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/usesTool", "usesTool"), + ]), + iri="https://spdx.org/rdf/3.0.0/terms/Core/relationshipType", + min_count=1, + compact="relationshipType", + ) + # Specifies the time from which an element is applicable / valid. + cls._add_property( + "startTime", + DateTimeStampProp(pattern=r"^\d\d\d\d-\d\d-\d\dT\d\d:\d\d:\d\dZ$",), + iri="https://spdx.org/rdf/3.0.0/terms/Core/startTime", + compact="startTime", + ) + # References an Element on the right-hand side of a relationship. + cls._add_property( + "to", + ListProp(ObjectProp(Element, False)), + iri="https://spdx.org/rdf/3.0.0/terms/Core/to", + min_count=1, + compact="to", + ) + + +# Indicates whether a relationship is known to be complete, incomplete, or if no assertion is made with respect to relationship completeness. 
+@register("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipCompleteness", compact_type="RelationshipCompleteness", abstract=False) +class RelationshipCompleteness(SHACLObject): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + "complete": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipCompleteness/complete", + "incomplete": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipCompleteness/incomplete", + "noAssertion": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipCompleteness/noAssertion", + } + # The relationship is known to be exhaustive. + complete = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipCompleteness/complete" + # The relationship is known not to be exhaustive. + incomplete = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipCompleteness/incomplete" + # No assertion can be made about the completeness of the relationship. + noAssertion = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipCompleteness/noAssertion" + + +# Information about the relationship between two Elements. +@register("https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType", compact_type="RelationshipType", abstract=False) +class RelationshipType(SHACLObject): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + "affects": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/affects", + "amendedBy": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/amendedBy", + "ancestorOf": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/ancestorOf", + "availableFrom": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/availableFrom", + "configures": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/configures", + "contains": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/contains", + "coordinatedBy": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/coordinatedBy", + "copiedTo": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/copiedTo", + "delegatedTo": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/delegatedTo", + "dependsOn": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/dependsOn", + "descendantOf": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/descendantOf", + "describes": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/describes", + "doesNotAffect": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/doesNotAffect", + "expandsTo": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/expandsTo", + "exploitCreatedBy": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/exploitCreatedBy", + "fixedBy": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/fixedBy", + "fixedIn": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/fixedIn", + "foundBy": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/foundBy", + "generates": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/generates", + "hasAddedFile": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasAddedFile", + "hasAssessmentFor": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasAssessmentFor", + "hasAssociatedVulnerability": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasAssociatedVulnerability", + "hasConcludedLicense": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasConcludedLicense", + "hasDataFile": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasDataFile", + "hasDeclaredLicense": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasDeclaredLicense", + "hasDeletedFile": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasDeletedFile", + 
"hasDependencyManifest": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasDependencyManifest", + "hasDistributionArtifact": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasDistributionArtifact", + "hasDocumentation": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasDocumentation", + "hasDynamicLink": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasDynamicLink", + "hasEvidence": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasEvidence", + "hasExample": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasExample", + "hasHost": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasHost", + "hasInputs": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasInputs", + "hasMetadata": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasMetadata", + "hasOptionalComponent": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasOptionalComponent", + "hasOptionalDependency": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasOptionalDependency", + "hasOutputs": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasOutputs", + "hasPrerequsite": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasPrerequsite", + "hasProvidedDependency": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasProvidedDependency", + "hasRequirement": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasRequirement", + "hasSpecification": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasSpecification", + "hasStaticLink": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasStaticLink", + "hasTest": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasTest", + "hasTestCase": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasTestCase", + "hasVariant": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasVariant", + "invokedBy": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/invokedBy", + "modifiedBy": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/modifiedBy", + "other": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/other", + "packagedBy": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/packagedBy", + "patchedBy": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/patchedBy", + "publishedBy": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/publishedBy", + "reportedBy": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/reportedBy", + "republishedBy": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/republishedBy", + "serializedInArtifact": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/serializedInArtifact", + "testedOn": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/testedOn", + "trainedOn": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/trainedOn", + "underInvestigationFor": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/underInvestigationFor", + "usesTool": "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/usesTool", + } + # (Security/VEX) The `from` vulnerability affect each `to` Element + affects = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/affects" + # The `from` Element is amended by each `to` Element + amendedBy = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/amendedBy" + # The `from` Element is an ancestor of each `to` Element + ancestorOf = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/ancestorOf" + # The `from` Element is available from the additional supplier described by each `to` Element + availableFrom = 
"https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/availableFrom" + # The `from` Element is a configuration applied to each `to` Element during a LifecycleScopeType period + configures = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/configures" + # The `from` Element contains each `to` Element + contains = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/contains" + # (Security) The `from` Vulnerability is coordinatedBy the `to` Agent(s) (vendor, researcher, or consumer agent) + coordinatedBy = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/coordinatedBy" + # The `from` Element has been copied to each `to` Element + copiedTo = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/copiedTo" + # The `from` Agent is delegating an action to the Agent of the `to` Relationship (which must be of type invokedBy) during a LifecycleScopeType. (e.g. the `to` invokedBy Relationship is being done on behalf of `from`) + delegatedTo = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/delegatedTo" + # The `from` Element depends on each `to` Element during a LifecycleScopeType period. + dependsOn = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/dependsOn" + # The `from` Element is a descendant of each `to` Element + descendantOf = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/descendantOf" + # The `from` Element describes each `to` Element. To denote the root(s) of a tree of elements in a collection, the rootElement property should be used. + describes = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/describes" + # (Security/VEX) The `from` Vulnerability has no impact on each `to` Element + doesNotAffect = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/doesNotAffect" + # The `from` archive expands out as an artifact described by each `to` Element + expandsTo = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/expandsTo" + # (Security) The `from` Vulnerability has had an exploit created against it by each `to` Agent + exploitCreatedBy = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/exploitCreatedBy" + # (Security) Designates a `from` Vulnerability has been fixed by the `to` Agent(s) + fixedBy = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/fixedBy" + # (Security/VEX) A `from` Vulnerability has been fixed in each of the `to` Element(s) + fixedIn = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/fixedIn" + # (Security) Designates a `from` Vulnerability was originally discovered by the `to` Agent(s) + foundBy = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/foundBy" + # The `from` Element generates each `to` Element + generates = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/generates" + # Every `to` Element is is a file added to the `from` Element (`from` hasAddedFile `to`) + hasAddedFile = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasAddedFile" + # (Security) Relates a `from` Vulnerability and each `to` Element(s) with a security assessment. 
To be used with `VulnAssessmentRelationship` types + hasAssessmentFor = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasAssessmentFor" + # (Security) Used to associate a `from` Artifact with each `to` Vulnerability + hasAssociatedVulnerability = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasAssociatedVulnerability" + # The `from` Software Artifact is concluded by the SPDX data creator to be governed by each `to` license + hasConcludedLicense = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasConcludedLicense" + # The `from` Element treats each `to` Element as a data file + hasDataFile = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasDataFile" + # The `from` Software Artifact was discovered to actually contain each `to` license, for example as detected by use of automated tooling. + hasDeclaredLicense = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasDeclaredLicense" + # Every `to` Element is a file deleted from the `from` Element (`from` hasDeletedFile `to`) + hasDeletedFile = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasDeletedFile" + # The `from` Element has manifest files that contain dependency information in each `to` Element + hasDependencyManifest = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasDependencyManifest" + # The `from` Element is distributed as an artifact in each Element `to`, (e.g. an RPM or archive file) + hasDistributionArtifact = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasDistributionArtifact" + # The `from` Element is documented by each `to` Element + hasDocumentation = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasDocumentation" + # The `from` Element dynamically links in each `to` Element, during a LifecycleScopeType period. + hasDynamicLink = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasDynamicLink" + # (Dataset) Every `to` Element is considered as evidence for the `from` Element (`from` hasEvidence `to`) + hasEvidence = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasEvidence" + # Every `to` Element is an example for the `from` Element (`from` hasExample `to`) + hasExample = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasExample" + # The `from` Build was run on the `to` Element during a LifecycleScopeType period (e.g. The host that the build runs on) + hasHost = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasHost" + # The `from` Build has each `to` Elements as an input during a LifecycleScopeType period. + hasInputs = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasInputs" + # Every `to` Element is metadata about the `from` Element (`from` hasMetadata `to`) + hasMetadata = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasMetadata" + # Every `to` Element is an optional component of the `from` Element (`from` hasOptionalComponent `to`) + hasOptionalComponent = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasOptionalComponent" + # The `from` Element optionally depends on each `to` Element during a LifecycleScopeType period + hasOptionalDependency = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasOptionalDependency" + # The `from` Build element generates each `to` Element as an output during a LifecycleScopeType period. 
+ hasOutputs = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasOutputs" + # The `from` Element has a prerequsite on each `to` Element, during a LifecycleScopeType period + hasPrerequsite = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasPrerequsite" + # The `from` Element has a dependency on each `to` Element, but dependency is not in the distributed artifact, but assumed to be provided, during a LifecycleScopeType period + hasProvidedDependency = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasProvidedDependency" + # The `from` Element has a requirement on each `to` Element, during a LifecycleScopeType period + hasRequirement = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasRequirement" + # Every `to` Element is a specification for the `from` Element (`from` hasSpecification `to`), during a LifecycleScopeType period + hasSpecification = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasSpecification" + # The `from` Element statically links in each `to` Element, during a LifecycleScopeType period + hasStaticLink = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasStaticLink" + # Every `to` Element is a test artifact for the `from` Element (`from` hasTest `to`), during a LifecycleScopeType period + hasTest = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasTest" + # Every `to` Element is a test case for the `from` Element (`from` hasTestCase `to`) + hasTestCase = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasTestCase" + # Every `to` Element is a variant the `from` Element (`from` hasVariant `to`) + hasVariant = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/hasVariant" + # The `from` Element was invoked by the `to` Agent during a LifecycleScopeType period (for example, a Build element that describes a build step) + invokedBy = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/invokedBy" + # The `from` Element is modified by each `to` Element + modifiedBy = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/modifiedBy" + # Every `to` Element is related to the `from` Element where the relationship type is not described by any of the SPDX relationhip types (this relationship is directionless) + other = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/other" + # Every `to` Element is a packaged instance of the `from` Element (`from` packagedBy `to`) + packagedBy = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/packagedBy" + # Every `to` Element is a patch for the `from` Element (`from` patchedBy `to`) + patchedBy = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/patchedBy" + # (Security) Designates a `from` Vulnerability was made available for public use or reference by each `to` Agent + publishedBy = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/publishedBy" + # (Security) Designates a `from` Vulnerability was first reported to a project, vendor, or tracking database for formal identification by each `to` Agent + reportedBy = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/reportedBy" + # (Security) Designates a `from` Vulnerability's details were tracked, aggregated, and/or enriched to improve context (i.e. 
NVD) by a `to` Agent(s) + republishedBy = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/republishedBy" + # The `from` SPDXDocument can be found in a serialized form in each `to` Artifact + serializedInArtifact = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/serializedInArtifact" + # (AI, Dataset) The `from` Element has been tested on the `to` Element + testedOn = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/testedOn" + # (AI, Dataset) The `from` Element has been trained by the `to` Element(s) + trainedOn = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/trainedOn" + # (Security/VEX) The `from` Vulnerability impact is being investigated for each `to` Element + underInvestigationFor = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/underInvestigationFor" + # The `from` Element uses each `to` Element as a tool during a LifecycleScopeType period. + usesTool = "https://spdx.org/rdf/3.0.0/terms/Core/RelationshipType/usesTool" + + +# A collection of SPDX Elements that could potentially be serialized. +@register("https://spdx.org/rdf/3.0.0/terms/Core/SpdxDocument", compact_type="SpdxDocument", abstract=False) +class SpdxDocument(ElementCollection): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Provides the license under which the SPDX documentation of the Element can be + # used. + cls._add_property( + "dataLicense", + ObjectProp(simplelicensing_AnyLicenseInfo, False), + iri="https://spdx.org/rdf/3.0.0/terms/Core/dataLicense", + compact="dataLicense", + ) + # Provides an ExternalMap of Element identifiers. + cls._add_property( + "imports", + ListProp(ObjectProp(ExternalMap, False)), + iri="https://spdx.org/rdf/3.0.0/terms/Core/imports", + compact="imports", + ) + # Provides a NamespaceMap of prefixes and associated namespace partial URIs applicable to an SpdxDocument and independent of any specific serialization format or instance. + cls._add_property( + "namespaceMap", + ListProp(ObjectProp(NamespaceMap, False)), + iri="https://spdx.org/rdf/3.0.0/terms/Core/namespaceMap", + compact="namespaceMap", + ) + + +# Indicates the type of support that is associated with an artifact. +@register("https://spdx.org/rdf/3.0.0/terms/Core/SupportType", compact_type="SupportType", abstract=False) +class SupportType(SHACLObject): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + "deployed": "https://spdx.org/rdf/3.0.0/terms/Core/SupportType/deployed", + "development": "https://spdx.org/rdf/3.0.0/terms/Core/SupportType/development", + "endOfSupport": "https://spdx.org/rdf/3.0.0/terms/Core/SupportType/endOfSupport", + "limitedSupport": "https://spdx.org/rdf/3.0.0/terms/Core/SupportType/limitedSupport", + "noAssertion": "https://spdx.org/rdf/3.0.0/terms/Core/SupportType/noAssertion", + "noSupport": "https://spdx.org/rdf/3.0.0/terms/Core/SupportType/noSupport", + "support": "https://spdx.org/rdf/3.0.0/terms/Core/SupportType/support", + } + # in addition to being supported by the supplier, the software is known to have been deployed and is in use. For a software as a service provider, this implies the software is now available as a service. + deployed = "https://spdx.org/rdf/3.0.0/terms/Core/SupportType/deployed" + # the artifact is in active development and is not considered ready for formal support from the supplier. 
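[Editorial aside, not part of the patch: SpdxDocument is an ElementCollection that can carry a data licence, imports and a namespace map. A minimal sketch assuming attribute-style access; the prefix and namespace strings are hypothetical:]

    doc = SpdxDocument()
    nsmap = NamespaceMap()
    nsmap.prefix = "example"                           # hypothetical prefix
    nsmap.namespace = "http://example.com/spdxdocs/"   # hypothetical namespace
    doc.namespaceMap = [nsmap]
    # dataLicense takes any simplelicensing_AnyLicenseInfo, for example the
    # simplelicensing_LicenseExpression class defined later in this file.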
+ development = "https://spdx.org/rdf/3.0.0/terms/Core/SupportType/development" + # there is a defined end of support for the artifact from the supplier. This may also be referred to as end of life. There is a validUntilDate that can be used to signal when support ends for the artifact. + endOfSupport = "https://spdx.org/rdf/3.0.0/terms/Core/SupportType/endOfSupport" + # the artifact has been released, and there is limited support available from the supplier. There is a validUntilDate that can provide additional information about the duration of support. + limitedSupport = "https://spdx.org/rdf/3.0.0/terms/Core/SupportType/limitedSupport" + # no assertion about the type of support is made. This is considered the default if no other support type is used. + noAssertion = "https://spdx.org/rdf/3.0.0/terms/Core/SupportType/noAssertion" + # there is no support for the artifact from the supplier, consumer assumes any support obligations. + noSupport = "https://spdx.org/rdf/3.0.0/terms/Core/SupportType/noSupport" + # the artifact has been released, and is supported from the supplier. There is a validUntilDate that can provide additional information about the duration of support. + support = "https://spdx.org/rdf/3.0.0/terms/Core/SupportType/support" + + +# An element of hardware and/or software utilized to carry out a particular function. +@register("https://spdx.org/rdf/3.0.0/terms/Core/Tool", compact_type="Tool", abstract=False) +class Tool(Element): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + +# Categories of confidentiality level. +@register("https://spdx.org/rdf/3.0.0/terms/Dataset/ConfidentialityLevelType", compact_type="dataset_ConfidentialityLevelType", abstract=False) +class dataset_ConfidentialityLevelType(SHACLObject): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + "amber": "https://spdx.org/rdf/3.0.0/terms/Dataset/ConfidentialityLevelType/amber", + "clear": "https://spdx.org/rdf/3.0.0/terms/Dataset/ConfidentialityLevelType/clear", + "green": "https://spdx.org/rdf/3.0.0/terms/Dataset/ConfidentialityLevelType/green", + "red": "https://spdx.org/rdf/3.0.0/terms/Dataset/ConfidentialityLevelType/red", + } + # Data points in the dataset can be shared only with specific + amber = "https://spdx.org/rdf/3.0.0/terms/Dataset/ConfidentialityLevelType/amber" + # Dataset may be distributed freely, without restriction. + clear = "https://spdx.org/rdf/3.0.0/terms/Dataset/ConfidentialityLevelType/clear" + # Dataset can be shared within a community of peers and partners. + green = "https://spdx.org/rdf/3.0.0/terms/Dataset/ConfidentialityLevelType/green" + # Data points in the dataset are highly confidential and can only be shared + red = "https://spdx.org/rdf/3.0.0/terms/Dataset/ConfidentialityLevelType/red" + + +# Availability of dataset. 
+@register("https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetAvailabilityType", compact_type="dataset_DatasetAvailabilityType", abstract=False) +class dataset_DatasetAvailabilityType(SHACLObject): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + "clickthrough": "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetAvailabilityType/clickthrough", + "directDownload": "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetAvailabilityType/directDownload", + "query": "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetAvailabilityType/query", + "registration": "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetAvailabilityType/registration", + "scrapingScript": "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetAvailabilityType/scrapingScript", + } + # the dataset is not publicly available and can only be accessed + clickthrough = "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetAvailabilityType/clickthrough" + # the dataset is publicly available and can be downloaded + directDownload = "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetAvailabilityType/directDownload" + # the dataset is publicly available, but not all at once, and can only + query = "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetAvailabilityType/query" + # the dataset is not publicly available and an email registration + registration = "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetAvailabilityType/registration" + # the dataset provider is not making available the underlying + scrapingScript = "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetAvailabilityType/scrapingScript" + + +# Enumeration of dataset types. +@register("https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType", compact_type="dataset_DatasetType", abstract=False) +class dataset_DatasetType(SHACLObject): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + "audio": "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/audio", + "categorical": "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/categorical", + "graph": "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/graph", + "image": "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/image", + "noAssertion": "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/noAssertion", + "numeric": "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/numeric", + "other": "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/other", + "sensor": "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/sensor", + "structured": "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/structured", + "syntactic": "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/syntactic", + "text": "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/text", + "timeseries": "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/timeseries", + "timestamp": "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/timestamp", + "video": "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/video", + } + # data is audio based, such as a collection of music from the 80s. + audio = "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/audio" + # data that is classified into a discrete number of categories, + categorical = "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/categorical" + # data is in the form of a graph where entries are somehow related to + graph = "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/graph" + # data is a collection of images such as pictures of animals. + image = "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/image" + # data type is not known. 
+ noAssertion = "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/noAssertion" + # data consists only of numeric entries. + numeric = "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/numeric" + # data is of a type not included in this list. + other = "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/other" + # data is recorded from a physical sensor, such as a thermometer + sensor = "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/sensor" + # data is stored in tabular format or retrieved from a relational + structured = "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/structured" + # data describes the syntax or semantics of a language or text, such + syntactic = "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/syntactic" + # data consists of unstructured text, such as a book, Wikipedia article + text = "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/text" + # data is recorded in an ordered sequence of timestamped entries, + timeseries = "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/timeseries" + # data is recorded with a timestamp for each entry, but not + timestamp = "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/timestamp" + # data is video based, such as a collection of movie clips featuring Tom + video = "https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/video" + + +# Abstract class for additional text intended to be added to a License, but +# which is not itself a standalone License. +@register("https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/LicenseAddition", compact_type="expandedlicensing_LicenseAddition", abstract=True) +class expandedlicensing_LicenseAddition(Element): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Identifies the full text of a LicenseAddition. + cls._add_property( + "expandedlicensing_additionText", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/additionText", + min_count=1, + compact="expandedlicensing_additionText", + ) + # Specifies whether an additional text identifier has been marked as deprecated. + cls._add_property( + "expandedlicensing_isDeprecatedAdditionId", + BooleanProp(), + iri="https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/isDeprecatedAdditionId", + compact="expandedlicensing_isDeprecatedAdditionId", + ) + # Identifies all the text and metadata associated with a license in the license + # XML format. + cls._add_property( + "expandedlicensing_licenseXml", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/licenseXml", + compact="expandedlicensing_licenseXml", + ) + # Specifies the licenseId that is preferred to be used in place of a deprecated + # License or LicenseAddition. + cls._add_property( + "expandedlicensing_obsoletedBy", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/obsoletedBy", + compact="expandedlicensing_obsoletedBy", + ) + # Contains a URL where the License or LicenseAddition can be found in use. + cls._add_property( + "expandedlicensing_seeAlso", + ListProp(AnyURIProp()), + iri="https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/seeAlso", + compact="expandedlicensing_seeAlso", + ) + # Identifies the full text of a LicenseAddition, in SPDX templating format. 
+ cls._add_property( + "expandedlicensing_standardAdditionTemplate", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/standardAdditionTemplate", + compact="expandedlicensing_standardAdditionTemplate", + ) + + +# A license exception that is listed on the SPDX Exceptions list. +@register("https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/ListedLicenseException", compact_type="expandedlicensing_ListedLicenseException", abstract=False) +class expandedlicensing_ListedLicenseException(expandedlicensing_LicenseAddition): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Specifies the SPDX License List version in which this license or exception + # identifier was deprecated. + cls._add_property( + "expandedlicensing_deprecatedVersion", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/deprecatedVersion", + compact="expandedlicensing_deprecatedVersion", + ) + # Specifies the SPDX License List version in which this ListedLicense or + # ListedLicenseException identifier was first added. + cls._add_property( + "expandedlicensing_listVersionAdded", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/listVersionAdded", + compact="expandedlicensing_listVersionAdded", + ) + + +# A property name with an associated value. +@register("https://spdx.org/rdf/3.0.0/terms/Extension/CdxPropertyEntry", compact_type="extension_CdxPropertyEntry", abstract=False) +class extension_CdxPropertyEntry(SHACLObject): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # A name used in a CdxExtension name-value pair. + cls._add_property( + "extension_cdxPropName", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Extension/cdxPropName", + min_count=1, + compact="extension_cdxPropName", + ) + # A value used in a CdxExtension name-value pair. + cls._add_property( + "extension_cdxPropValue", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Extension/cdxPropValue", + compact="extension_cdxPropValue", + ) + + +# A characterization of some aspect of an Element that is associated with the Element in a generalized fashion. +@register("https://spdx.org/rdf/3.0.0/terms/Extension/Extension", compact_type="extension_Extension", abstract=True) +class extension_Extension(SHACLExtensibleObject, SHACLObject): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + } + + +# Specifies the CVSS base, temporal, threat, or environmental severity type. 
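[Editorial aside, not part of the patch: extension_CdxPropertyEntry is a simple name/value pair used by the CycloneDX property extension; only the name is mandatory. A sketch with hypothetical values, assuming attribute-style access:]

    entry = extension_CdxPropertyEntry()
    entry.extension_cdxPropName = "example:buildId"    # hypothetical name
    entry.extension_cdxPropValue = "42"                # optional value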
+@register("https://spdx.org/rdf/3.0.0/terms/Security/CvssSeverityType", compact_type="security_CvssSeverityType", abstract=False) +class security_CvssSeverityType(SHACLObject): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + "critical": "https://spdx.org/rdf/3.0.0/terms/Security/CvssSeverityType/critical", + "high": "https://spdx.org/rdf/3.0.0/terms/Security/CvssSeverityType/high", + "low": "https://spdx.org/rdf/3.0.0/terms/Security/CvssSeverityType/low", + "medium": "https://spdx.org/rdf/3.0.0/terms/Security/CvssSeverityType/medium", + "none": "https://spdx.org/rdf/3.0.0/terms/Security/CvssSeverityType/none", + } + # When a CVSS score is between 9.0 - 10.0 + critical = "https://spdx.org/rdf/3.0.0/terms/Security/CvssSeverityType/critical" + # When a CVSS score is between 7.0 - 8.9 + high = "https://spdx.org/rdf/3.0.0/terms/Security/CvssSeverityType/high" + # When a CVSS score is between 0 - 3.9 + low = "https://spdx.org/rdf/3.0.0/terms/Security/CvssSeverityType/low" + # When a CVSS score is between 4 - 6.9 + medium = "https://spdx.org/rdf/3.0.0/terms/Security/CvssSeverityType/medium" + # When a CVSS score is 0 + none = "https://spdx.org/rdf/3.0.0/terms/Security/CvssSeverityType/none" + + +# Specifies the exploit catalog type. +@register("https://spdx.org/rdf/3.0.0/terms/Security/ExploitCatalogType", compact_type="security_ExploitCatalogType", abstract=False) +class security_ExploitCatalogType(SHACLObject): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + "kev": "https://spdx.org/rdf/3.0.0/terms/Security/ExploitCatalogType/kev", + "other": "https://spdx.org/rdf/3.0.0/terms/Security/ExploitCatalogType/other", + } + # CISA's Known Exploited Vulnerability (KEV) Catalog + kev = "https://spdx.org/rdf/3.0.0/terms/Security/ExploitCatalogType/kev" + # Other exploit catalogs + other = "https://spdx.org/rdf/3.0.0/terms/Security/ExploitCatalogType/other" + + +# Specifies the SSVC decision type. +@register("https://spdx.org/rdf/3.0.0/terms/Security/SsvcDecisionType", compact_type="security_SsvcDecisionType", abstract=False) +class security_SsvcDecisionType(SHACLObject): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + "act": "https://spdx.org/rdf/3.0.0/terms/Security/SsvcDecisionType/act", + "attend": "https://spdx.org/rdf/3.0.0/terms/Security/SsvcDecisionType/attend", + "track": "https://spdx.org/rdf/3.0.0/terms/Security/SsvcDecisionType/track", + "trackStar": "https://spdx.org/rdf/3.0.0/terms/Security/SsvcDecisionType/trackStar", + } + # The vulnerability requires attention from the organization's internal, supervisory-level and leadership-level individuals. Necessary actions include requesting assistance or information about the vulnerability, as well as publishing a notification either internally and/or externally. Typically, internal groups would meet to determine the overall response and then execute agreed upon actions. CISA recommends remediating Act vulnerabilities as soon as possible. + act = "https://spdx.org/rdf/3.0.0/terms/Security/SsvcDecisionType/act" + # The vulnerability requires attention from the organization's internal, supervisory-level individuals. Necessary actions include requesting assistance or information about the vulnerability, and may involve publishing a notification either internally and/or externally. CISA recommends remediating Attend vulnerabilities sooner than standard update timelines. 
+ attend = "https://spdx.org/rdf/3.0.0/terms/Security/SsvcDecisionType/attend" + # The vulnerability does not require action at this time. The organization would continue to track the vulnerability and reassess it if new information becomes available. CISA recommends remediating Track vulnerabilities within standard update timelines. + track = "https://spdx.org/rdf/3.0.0/terms/Security/SsvcDecisionType/track" + # ("Track*" in the SSVC spec) The vulnerability contains specific characteristics that may require closer monitoring for changes. CISA recommends remediating Track* vulnerabilities within standard update timelines. + trackStar = "https://spdx.org/rdf/3.0.0/terms/Security/SsvcDecisionType/trackStar" + + +# Specifies the VEX justification type. +@register("https://spdx.org/rdf/3.0.0/terms/Security/VexJustificationType", compact_type="security_VexJustificationType", abstract=False) +class security_VexJustificationType(SHACLObject): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + "componentNotPresent": "https://spdx.org/rdf/3.0.0/terms/Security/VexJustificationType/componentNotPresent", + "inlineMitigationsAlreadyExist": "https://spdx.org/rdf/3.0.0/terms/Security/VexJustificationType/inlineMitigationsAlreadyExist", + "vulnerableCodeCannotBeControlledByAdversary": "https://spdx.org/rdf/3.0.0/terms/Security/VexJustificationType/vulnerableCodeCannotBeControlledByAdversary", + "vulnerableCodeNotInExecutePath": "https://spdx.org/rdf/3.0.0/terms/Security/VexJustificationType/vulnerableCodeNotInExecutePath", + "vulnerableCodeNotPresent": "https://spdx.org/rdf/3.0.0/terms/Security/VexJustificationType/vulnerableCodeNotPresent", + } + # The software is not affected because the vulnerable component is not in the product. + componentNotPresent = "https://spdx.org/rdf/3.0.0/terms/Security/VexJustificationType/componentNotPresent" + # Built-in inline controls or mitigations prevent an adversary from leveraging the vulnerability. + inlineMitigationsAlreadyExist = "https://spdx.org/rdf/3.0.0/terms/Security/VexJustificationType/inlineMitigationsAlreadyExist" + # The vulnerable component is present, and the component contains the vulnerable code. However, vulnerable code is used in such a way that an attacker cannot mount any anticipated attack. + vulnerableCodeCannotBeControlledByAdversary = "https://spdx.org/rdf/3.0.0/terms/Security/VexJustificationType/vulnerableCodeCannotBeControlledByAdversary" + # The affected code is not reachable through the execution of the code, including non-anticipated states of the product. + vulnerableCodeNotInExecutePath = "https://spdx.org/rdf/3.0.0/terms/Security/VexJustificationType/vulnerableCodeNotInExecutePath" + # The product is not affected because the code underlying the vulnerability is not present in the product. + vulnerableCodeNotPresent = "https://spdx.org/rdf/3.0.0/terms/Security/VexJustificationType/vulnerableCodeNotPresent" + + +# Abstract ancestor class for all vulnerability assessments +@register("https://spdx.org/rdf/3.0.0/terms/Security/VulnAssessmentRelationship", compact_type="security_VulnAssessmentRelationship", abstract=True) +class security_VulnAssessmentRelationship(Relationship): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Identifies who or what supplied the artifact or VulnAssessmentRelationship + # referenced by the Element. 
+ cls._add_property( + "suppliedBy", + ObjectProp(Agent, False), + iri="https://spdx.org/rdf/3.0.0/terms/Core/suppliedBy", + compact="suppliedBy", + ) + # Specifies an Element contained in a piece of software where a vulnerability was + # found. + cls._add_property( + "security_assessedElement", + ObjectProp(Element, False), + iri="https://spdx.org/rdf/3.0.0/terms/Security/assessedElement", + compact="security_assessedElement", + ) + # Specifies a time when a vulnerability assessment was modified + cls._add_property( + "security_modifiedTime", + DateTimeStampProp(pattern=r"^\d\d\d\d-\d\d-\d\dT\d\d:\d\d:\d\dZ$",), + iri="https://spdx.org/rdf/3.0.0/terms/Security/modifiedTime", + compact="security_modifiedTime", + ) + # Specifies the time when a vulnerability was published. + cls._add_property( + "security_publishedTime", + DateTimeStampProp(pattern=r"^\d\d\d\d-\d\d-\d\dT\d\d:\d\d:\d\dZ$",), + iri="https://spdx.org/rdf/3.0.0/terms/Security/publishedTime", + compact="security_publishedTime", + ) + # Specified the time and date when a vulnerability was withdrawn. + cls._add_property( + "security_withdrawnTime", + DateTimeStampProp(pattern=r"^\d\d\d\d-\d\d-\d\dT\d\d:\d\d:\d\dZ$",), + iri="https://spdx.org/rdf/3.0.0/terms/Security/withdrawnTime", + compact="security_withdrawnTime", + ) + + +# Abstract class representing a license combination consisting of one or more +# licenses (optionally including additional text), which may be combined +# according to the SPDX license expression syntax. +@register("https://spdx.org/rdf/3.0.0/terms/SimpleLicensing/AnyLicenseInfo", compact_type="simplelicensing_AnyLicenseInfo", abstract=True) +class simplelicensing_AnyLicenseInfo(Element): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + +# An SPDX Element containing an SPDX license expression string. +@register("https://spdx.org/rdf/3.0.0/terms/SimpleLicensing/LicenseExpression", compact_type="simplelicensing_LicenseExpression", abstract=False) +class simplelicensing_LicenseExpression(simplelicensing_AnyLicenseInfo): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Maps a LicenseRef or AdditionRef string for a Custom License or a Custom + # License Addition to its URI ID. + cls._add_property( + "simplelicensing_customIdToUri", + ListProp(ObjectProp(DictionaryEntry, False)), + iri="https://spdx.org/rdf/3.0.0/terms/SimpleLicensing/customIdToUri", + compact="simplelicensing_customIdToUri", + ) + # A string in the license expression format. + cls._add_property( + "simplelicensing_licenseExpression", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/SimpleLicensing/licenseExpression", + min_count=1, + compact="simplelicensing_licenseExpression", + ) + # The version of the SPDX License List used in the license expression. + cls._add_property( + "simplelicensing_licenseListVersion", + StringProp(pattern=r"^(0|[1-9]\d*)\.(0|[1-9]\d*)\.(0|[1-9]\d*)(?:-((?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*)(?:\.(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*))*))?(?:\+([0-9a-zA-Z-]+(?:\.[0-9a-zA-Z-]+)*))?$",), + iri="https://spdx.org/rdf/3.0.0/terms/SimpleLicensing/licenseListVersion", + compact="simplelicensing_licenseListVersion", + ) + + +# A license or addition that is not listed on the SPDX License List. 
+@register("https://spdx.org/rdf/3.0.0/terms/SimpleLicensing/SimpleLicensingText", compact_type="simplelicensing_SimpleLicensingText", abstract=False) +class simplelicensing_SimpleLicensingText(Element): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Identifies the full text of a License or Addition. + cls._add_property( + "simplelicensing_licenseText", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/SimpleLicensing/licenseText", + min_count=1, + compact="simplelicensing_licenseText", + ) + + +# A canonical, unique, immutable identifier +@register("https://spdx.org/rdf/3.0.0/terms/Software/ContentIdentifier", compact_type="software_ContentIdentifier", abstract=False) +class software_ContentIdentifier(IntegrityMethod): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Specifies the type of the content identifier. + cls._add_property( + "software_contentIdentifierType", + EnumProp([ + ("https://spdx.org/rdf/3.0.0/terms/Software/ContentIdentifierType/gitoid", "gitoid"), + ("https://spdx.org/rdf/3.0.0/terms/Software/ContentIdentifierType/swhid", "swhid"), + ]), + iri="https://spdx.org/rdf/3.0.0/terms/Software/contentIdentifierType", + min_count=1, + compact="software_contentIdentifierType", + ) + # Specifies the value of the content identifier. + cls._add_property( + "software_contentIdentifierValue", + AnyURIProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Software/contentIdentifierValue", + min_count=1, + compact="software_contentIdentifierValue", + ) + + +# Specifies the type of a content identifier. +@register("https://spdx.org/rdf/3.0.0/terms/Software/ContentIdentifierType", compact_type="software_ContentIdentifierType", abstract=False) +class software_ContentIdentifierType(SHACLObject): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + "gitoid": "https://spdx.org/rdf/3.0.0/terms/Software/ContentIdentifierType/gitoid", + "swhid": "https://spdx.org/rdf/3.0.0/terms/Software/ContentIdentifierType/swhid", + } + # Gitoid stands for [Git Object ID](https://git-scm.com/book/en/v2/Git-Internals-Git-Objects) and a gitoid of type blob is a unique hash of a binary artifact. A gitoid may represent the software [Artifact ID](https://github.com/omnibor/spec/blob/main/spec/SPEC.md#artifact-id) or the [OmniBOR Identifier](https://github.com/omnibor/spec/blob/main/spec/SPEC.md#omnibor-identifier) for the software artifact's associated [OmniBOR Document](https://github.com/omnibor/spec/blob/main/spec/SPEC.md#omnibor-document). + gitoid = "https://spdx.org/rdf/3.0.0/terms/Software/ContentIdentifierType/gitoid" + # SoftWare Hash IDentifier, persistent intrinsic identifiers for digital artifacts. The syntax of the identifiers is defined in the [SWHID specification](https://www.swhid.org/specification/v1.1/4.Syntax) and in the case of filess they typically look like `swh:1:cnt:94a9ed024d3859793618152ea559a168bbcbb5e2`. + swhid = "https://spdx.org/rdf/3.0.0/terms/Software/ContentIdentifierType/swhid" + + +# Enumeration of the different kinds of SPDX file. 
+@register("https://spdx.org/rdf/3.0.0/terms/Software/FileKindType", compact_type="software_FileKindType", abstract=False) +class software_FileKindType(SHACLObject): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + "directory": "https://spdx.org/rdf/3.0.0/terms/Software/FileKindType/directory", + "file": "https://spdx.org/rdf/3.0.0/terms/Software/FileKindType/file", + } + # The file represents a directory and all content stored in that + directory = "https://spdx.org/rdf/3.0.0/terms/Software/FileKindType/directory" + # The file represents a single file (default). + file = "https://spdx.org/rdf/3.0.0/terms/Software/FileKindType/file" + + +# Provides a set of values to be used to describe the common types of SBOMs that +# tools may create. +@register("https://spdx.org/rdf/3.0.0/terms/Software/SbomType", compact_type="software_SbomType", abstract=False) +class software_SbomType(SHACLObject): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + "analyzed": "https://spdx.org/rdf/3.0.0/terms/Software/SbomType/analyzed", + "build": "https://spdx.org/rdf/3.0.0/terms/Software/SbomType/build", + "deployed": "https://spdx.org/rdf/3.0.0/terms/Software/SbomType/deployed", + "design": "https://spdx.org/rdf/3.0.0/terms/Software/SbomType/design", + "runtime": "https://spdx.org/rdf/3.0.0/terms/Software/SbomType/runtime", + "source": "https://spdx.org/rdf/3.0.0/terms/Software/SbomType/source", + } + # SBOM generated through analysis of artifacts (e.g., executables, packages, containers, and virtual machine images) after its build. Such analysis generally requires a variety of heuristics. In some contexts, this may also be referred to as a "3rd party" SBOM. + analyzed = "https://spdx.org/rdf/3.0.0/terms/Software/SbomType/analyzed" + # SBOM generated as part of the process of building the software to create a releasable artifact (e.g., executable or package) from data such as source files, dependencies, built components, build process ephemeral data, and other SBOMs. + build = "https://spdx.org/rdf/3.0.0/terms/Software/SbomType/build" + # SBOM provides an inventory of software that is present on a system. This may be an assembly of other SBOMs that combines analysis of configuration options, and examination of execution behavior in a (potentially simulated) deployment environment. + deployed = "https://spdx.org/rdf/3.0.0/terms/Software/SbomType/deployed" + # SBOM of intended, planned software project or product with included components (some of which may not yet exist) for a new software artifact. + design = "https://spdx.org/rdf/3.0.0/terms/Software/SbomType/design" + # SBOM generated through instrumenting the system running the software, to capture only components present in the system, as well as external call-outs or dynamically loaded components. In some contexts, this may also be referred to as an "Instrumented" or "Dynamic" SBOM. + runtime = "https://spdx.org/rdf/3.0.0/terms/Software/SbomType/runtime" + # SBOM created directly from the development environment, source files, and included dependencies used to build an product artifact. + source = "https://spdx.org/rdf/3.0.0/terms/Software/SbomType/source" + + +# Provides information about the primary purpose of an Element. 
+@register("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose", compact_type="software_SoftwarePurpose", abstract=False) +class software_SoftwarePurpose(SHACLObject): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + "application": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/application", + "archive": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/archive", + "bom": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/bom", + "configuration": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/configuration", + "container": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/container", + "data": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/data", + "device": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/device", + "deviceDriver": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/deviceDriver", + "diskImage": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/diskImage", + "documentation": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/documentation", + "evidence": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/evidence", + "executable": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/executable", + "file": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/file", + "filesystemImage": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/filesystemImage", + "firmware": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/firmware", + "framework": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/framework", + "install": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/install", + "library": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/library", + "manifest": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/manifest", + "model": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/model", + "module": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/module", + "operatingSystem": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/operatingSystem", + "other": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/other", + "patch": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/patch", + "platform": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/platform", + "requirement": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/requirement", + "source": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/source", + "specification": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/specification", + "test": "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/test", + } + # the Element is a software application + application = "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/application" + # the Element is an archived collection of one or more files (.tar, .zip, etc) + archive = "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/archive" + # Element is a bill of materials + bom = "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/bom" + # Element is configuration data + configuration = "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/configuration" + # the Element is a container image which can be used by a container runtime application + container = "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/container" + # Element is data + data = "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/data" + # the Element refers to a chipset, processor, or electronic board 
+    device = "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/device"
+    # Element represents software that controls hardware devices
+    deviceDriver = "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/deviceDriver"
+    # the Element refers to a disk image that can be written to a disk, booted in a VM, etc. A disk image typically contains most or all of the components necessary to boot, such as bootloaders, kernels, firmware, userspace, etc.
+    diskImage = "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/diskImage"
+    # Element is documentation
+    documentation = "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/documentation"
+    # the Element is the evidence that a specification or requirement has been fulfilled
+    evidence = "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/evidence"
+    # Element is an Artifact that can be run on a computer
+    executable = "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/executable"
+    # the Element is a single file which can be independently distributed (configuration file, statically linked binary, Kubernetes deployment, etc)
+    file = "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/file"
+    # the Element is a file system image that can be written to a disk (or virtual) partition
+    filesystemImage = "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/filesystemImage"
+    # the Element provides low level control over a device's hardware
+    firmware = "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/firmware"
+    # the Element is a software framework
+    framework = "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/framework"
+    # the Element is used to install software on disk
+    install = "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/install"
+    # the Element is a software library
+    library = "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/library"
+    # the Element is a software manifest
+    manifest = "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/manifest"
+    # the Element is a machine learning or artificial intelligence model
+    model = "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/model"
+    # the Element is a module of a piece of software
+    module = "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/module"
+    # the Element is an operating system
+    operatingSystem = "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/operatingSystem"
+    # the Element doesn't fit into any of the other categories
+    other = "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/other"
+    # Element contains a set of changes to update, fix, or improve another Element
+    patch = "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/patch"
+    # Element represents a runtime environment
+    platform = "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/platform"
+    # the Element provides a requirement needed as input for another Element
+    requirement = "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/requirement"
+    # the Element is a single or a collection of source files
+    source = "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/source"
+    # the Element is a plan, guideline or strategy for how to create, perform or analyse an application
+    specification = "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/specification"
+    # The Element is a test used to verify functionality on a software element
+    test = "https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/test"
+
+
+# Class that describes a build instance of 
software/artifacts. +@register("https://spdx.org/rdf/3.0.0/terms/Build/Build", compact_type="build_Build", abstract=False) +class build_Build(Element): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Property that describes the time at which a build stops. + cls._add_property( + "build_buildEndTime", + DateTimeStampProp(pattern=r"^\d\d\d\d-\d\d-\d\dT\d\d:\d\d:\d\dZ$",), + iri="https://spdx.org/rdf/3.0.0/terms/Build/buildEndTime", + compact="build_buildEndTime", + ) + # A buildId is a locally unique identifier used by a builder to identify a unique + # instance of a build produced by it. + cls._add_property( + "build_buildId", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Build/buildId", + compact="build_buildId", + ) + # Property describing the start time of a build. + cls._add_property( + "build_buildStartTime", + DateTimeStampProp(pattern=r"^\d\d\d\d-\d\d-\d\dT\d\d:\d\d:\d\dZ$",), + iri="https://spdx.org/rdf/3.0.0/terms/Build/buildStartTime", + compact="build_buildStartTime", + ) + # A buildType is a hint that is used to indicate the toolchain, platform, or + # infrastructure that the build was invoked on. + cls._add_property( + "build_buildType", + AnyURIProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Build/buildType", + min_count=1, + compact="build_buildType", + ) + # Property that describes the digest of the build configuration file used to + # invoke a build. + cls._add_property( + "build_configSourceDigest", + ListProp(ObjectProp(Hash, False)), + iri="https://spdx.org/rdf/3.0.0/terms/Build/configSourceDigest", + compact="build_configSourceDigest", + ) + # Property describes the invocation entrypoint of a build. + cls._add_property( + "build_configSourceEntrypoint", + ListProp(StringProp()), + iri="https://spdx.org/rdf/3.0.0/terms/Build/configSourceEntrypoint", + compact="build_configSourceEntrypoint", + ) + # Property that describes the URI of the build configuration source file. + cls._add_property( + "build_configSourceUri", + ListProp(AnyURIProp()), + iri="https://spdx.org/rdf/3.0.0/terms/Build/configSourceUri", + compact="build_configSourceUri", + ) + # Property describing the session in which a build is invoked. + cls._add_property( + "build_environment", + ListProp(ObjectProp(DictionaryEntry, False)), + iri="https://spdx.org/rdf/3.0.0/terms/Build/environment", + compact="build_environment", + ) + # Property describing the parameters used in an instance of a build. + cls._add_property( + "build_parameters", + ListProp(ObjectProp(DictionaryEntry, False)), + iri="https://spdx.org/rdf/3.0.0/terms/Build/parameters", + compact="build_parameters", + ) + + +# Agent represents anything with the potential to act on a system. +@register("https://spdx.org/rdf/3.0.0/terms/Core/Agent", compact_type="Agent", abstract=False) +class Agent(Element): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + +# An assertion made in relation to one or more elements. +@register("https://spdx.org/rdf/3.0.0/terms/Core/Annotation", compact_type="Annotation", abstract=False) +class Annotation(Element): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Describes the type of annotation. 
+ cls._add_property( + "annotationType", + EnumProp([ + ("https://spdx.org/rdf/3.0.0/terms/Core/AnnotationType/other", "other"), + ("https://spdx.org/rdf/3.0.0/terms/Core/AnnotationType/review", "review"), + ]), + iri="https://spdx.org/rdf/3.0.0/terms/Core/annotationType", + min_count=1, + compact="annotationType", + ) + # Specifies the media type of an Element or Property. + cls._add_property( + "contentType", + StringProp(pattern=r"^[^\/]+\/[^\/]+$",), + iri="https://spdx.org/rdf/3.0.0/terms/Core/contentType", + compact="contentType", + ) + # Commentary on an assertion that an annotator has made. + cls._add_property( + "statement", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Core/statement", + compact="statement", + ) + # An Element an annotator has made an assertion about. + cls._add_property( + "subject", + ObjectProp(Element, True), + iri="https://spdx.org/rdf/3.0.0/terms/Core/subject", + min_count=1, + compact="subject", + ) + + +# A distinct article or unit within the digital domain. +@register("https://spdx.org/rdf/3.0.0/terms/Core/Artifact", compact_type="Artifact", abstract=True) +class Artifact(Element): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Specifies the time an artifact was built. + cls._add_property( + "builtTime", + DateTimeStampProp(pattern=r"^\d\d\d\d-\d\d-\d\dT\d\d:\d\d:\d\dZ$",), + iri="https://spdx.org/rdf/3.0.0/terms/Core/builtTime", + compact="builtTime", + ) + # Identifies from where or whom the Element originally came. + cls._add_property( + "originatedBy", + ListProp(ObjectProp(Agent, False)), + iri="https://spdx.org/rdf/3.0.0/terms/Core/originatedBy", + compact="originatedBy", + ) + # Specifies the time an artifact was released. + cls._add_property( + "releaseTime", + DateTimeStampProp(pattern=r"^\d\d\d\d-\d\d-\d\dT\d\d:\d\d:\d\dZ$",), + iri="https://spdx.org/rdf/3.0.0/terms/Core/releaseTime", + compact="releaseTime", + ) + # The name of a relevant standard that may apply to an artifact. + cls._add_property( + "standardName", + ListProp(StringProp()), + iri="https://spdx.org/rdf/3.0.0/terms/Core/standardName", + compact="standardName", + ) + # Identifies who or what supplied the artifact or VulnAssessmentRelationship + # referenced by the Element. + cls._add_property( + "suppliedBy", + ObjectProp(Agent, False), + iri="https://spdx.org/rdf/3.0.0/terms/Core/suppliedBy", + compact="suppliedBy", + ) + # Specifies the level of support associated with an artifact. + cls._add_property( + "supportLevel", + ListProp(EnumProp([ + ("https://spdx.org/rdf/3.0.0/terms/Core/SupportType/deployed", "deployed"), + ("https://spdx.org/rdf/3.0.0/terms/Core/SupportType/development", "development"), + ("https://spdx.org/rdf/3.0.0/terms/Core/SupportType/endOfSupport", "endOfSupport"), + ("https://spdx.org/rdf/3.0.0/terms/Core/SupportType/limitedSupport", "limitedSupport"), + ("https://spdx.org/rdf/3.0.0/terms/Core/SupportType/noAssertion", "noAssertion"), + ("https://spdx.org/rdf/3.0.0/terms/Core/SupportType/noSupport", "noSupport"), + ("https://spdx.org/rdf/3.0.0/terms/Core/SupportType/support", "support"), + ])), + iri="https://spdx.org/rdf/3.0.0/terms/Core/supportLevel", + compact="supportLevel", + ) + # Specifies until when the artifact can be used before its usage needs to be + # reassessed. 
+ cls._add_property( + "validUntilTime", + DateTimeStampProp(pattern=r"^\d\d\d\d-\d\d-\d\dT\d\d:\d\d:\d\dZ$",), + iri="https://spdx.org/rdf/3.0.0/terms/Core/validUntilTime", + compact="validUntilTime", + ) + + +# A collection of Elements that have a shared context. +@register("https://spdx.org/rdf/3.0.0/terms/Core/Bundle", compact_type="Bundle", abstract=False) +class Bundle(ElementCollection): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Gives information about the circumstances or unifying properties + # that Elements of the bundle have been assembled under. + cls._add_property( + "context", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Core/context", + compact="context", + ) + + +# A mathematically calculated representation of a grouping of data. +@register("https://spdx.org/rdf/3.0.0/terms/Core/Hash", compact_type="Hash", abstract=False) +class Hash(IntegrityMethod): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Specifies the algorithm used for calculating the hash value. + cls._add_property( + "algorithm", + EnumProp([ + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/blake2b256", "blake2b256"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/blake2b384", "blake2b384"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/blake2b512", "blake2b512"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/blake3", "blake3"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/crystalsDilithium", "crystalsDilithium"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/crystalsKyber", "crystalsKyber"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/falcon", "falcon"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/md2", "md2"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/md4", "md4"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/md5", "md5"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/md6", "md6"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/other", "other"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha1", "sha1"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha224", "sha224"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha256", "sha256"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha384", "sha384"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha3_224", "sha3_224"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha3_256", "sha3_256"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha3_384", "sha3_384"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha3_512", "sha3_512"), + ("https://spdx.org/rdf/3.0.0/terms/Core/HashAlgorithm/sha512", "sha512"), + ]), + iri="https://spdx.org/rdf/3.0.0/terms/Core/algorithm", + min_count=1, + compact="algorithm", + ) + # The result of applying a hash algorithm to an Element. + cls._add_property( + "hashValue", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Core/hashValue", + min_count=1, + compact="hashValue", + ) + + +# Provide context for a relationship that occurs in the lifecycle. 
+@register("https://spdx.org/rdf/3.0.0/terms/Core/LifecycleScopedRelationship", compact_type="LifecycleScopedRelationship", abstract=False) +class LifecycleScopedRelationship(Relationship): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Capture the scope of information about a specific relationship between elements. + cls._add_property( + "scope", + EnumProp([ + ("https://spdx.org/rdf/3.0.0/terms/Core/LifecycleScopeType/build", "build"), + ("https://spdx.org/rdf/3.0.0/terms/Core/LifecycleScopeType/design", "design"), + ("https://spdx.org/rdf/3.0.0/terms/Core/LifecycleScopeType/development", "development"), + ("https://spdx.org/rdf/3.0.0/terms/Core/LifecycleScopeType/other", "other"), + ("https://spdx.org/rdf/3.0.0/terms/Core/LifecycleScopeType/runtime", "runtime"), + ("https://spdx.org/rdf/3.0.0/terms/Core/LifecycleScopeType/test", "test"), + ]), + iri="https://spdx.org/rdf/3.0.0/terms/Core/scope", + compact="scope", + ) + + +# A group of people who work together in an organized way for a shared purpose. +@register("https://spdx.org/rdf/3.0.0/terms/Core/Organization", compact_type="Organization", abstract=False) +class Organization(Agent): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + +# An individual human being. +@register("https://spdx.org/rdf/3.0.0/terms/Core/Person", compact_type="Person", abstract=False) +class Person(Agent): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + +# A software agent. +@register("https://spdx.org/rdf/3.0.0/terms/Core/SoftwareAgent", compact_type="SoftwareAgent", abstract=False) +class SoftwareAgent(Agent): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + +# Portion of an AnyLicenseInfo representing a set of licensing information +# where all elements apply. +@register("https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/ConjunctiveLicenseSet", compact_type="expandedlicensing_ConjunctiveLicenseSet", abstract=False) +class expandedlicensing_ConjunctiveLicenseSet(simplelicensing_AnyLicenseInfo): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # A license expression participating in a license set. + cls._add_property( + "expandedlicensing_member", + ListProp(ObjectProp(simplelicensing_AnyLicenseInfo, False)), + iri="https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/member", + min_count=2, + compact="expandedlicensing_member", + ) + + +# A license addition that is not listed on the SPDX Exceptions List. +@register("https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/CustomLicenseAddition", compact_type="expandedlicensing_CustomLicenseAddition", abstract=False) +class expandedlicensing_CustomLicenseAddition(expandedlicensing_LicenseAddition): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + +# Portion of an AnyLicenseInfo representing a set of licensing information where +# only one of the elements applies. 
+@register("https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/DisjunctiveLicenseSet", compact_type="expandedlicensing_DisjunctiveLicenseSet", abstract=False) +class expandedlicensing_DisjunctiveLicenseSet(simplelicensing_AnyLicenseInfo): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # A license expression participating in a license set. + cls._add_property( + "expandedlicensing_member", + ListProp(ObjectProp(simplelicensing_AnyLicenseInfo, False)), + iri="https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/member", + min_count=2, + compact="expandedlicensing_member", + ) + + +# Abstract class representing a License or an OrLaterOperator. +@register("https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/ExtendableLicense", compact_type="expandedlicensing_ExtendableLicense", abstract=True) +class expandedlicensing_ExtendableLicense(simplelicensing_AnyLicenseInfo): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + +# A concrete subclass of AnyLicenseInfo used by Individuals in the +# ExpandedLicensing profile. +@register("https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/IndividualLicensingInfo", compact_type="expandedlicensing_IndividualLicensingInfo", abstract=False) +class expandedlicensing_IndividualLicensingInfo(simplelicensing_AnyLicenseInfo): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + "NoAssertionLicense": "https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/NoAssertionLicense", + "NoneLicense": "https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/NoneLicense", + } + # An Individual Value for License when no assertion can be made about its actual + # value. + NoAssertionLicense = "https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/NoAssertionLicense" + # An Individual Value for License where the SPDX data creator determines that no + # license is present. + NoneLicense = "https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/NoneLicense" + + +# Abstract class for the portion of an AnyLicenseInfo representing a license. +@register("https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/License", compact_type="expandedlicensing_License", abstract=True) +class expandedlicensing_License(expandedlicensing_ExtendableLicense): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Specifies whether a license or additional text identifier has been marked as + # deprecated. + cls._add_property( + "expandedlicensing_isDeprecatedLicenseId", + BooleanProp(), + iri="https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/isDeprecatedLicenseId", + compact="expandedlicensing_isDeprecatedLicenseId", + ) + # Specifies whether the License is listed as free by the + # [Free Software Foundation (FSF)](https://fsf.org). + cls._add_property( + "expandedlicensing_isFsfLibre", + BooleanProp(), + iri="https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/isFsfLibre", + compact="expandedlicensing_isFsfLibre", + ) + # Specifies whether the License is listed as approved by the + # [Open Source Initiative (OSI)](https://opensource.org). + cls._add_property( + "expandedlicensing_isOsiApproved", + BooleanProp(), + iri="https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/isOsiApproved", + compact="expandedlicensing_isOsiApproved", + ) + # Identifies all the text and metadata associated with a license in the license + # XML format. 
+ cls._add_property( + "expandedlicensing_licenseXml", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/licenseXml", + compact="expandedlicensing_licenseXml", + ) + # Specifies the licenseId that is preferred to be used in place of a deprecated + # License or LicenseAddition. + cls._add_property( + "expandedlicensing_obsoletedBy", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/obsoletedBy", + compact="expandedlicensing_obsoletedBy", + ) + # Contains a URL where the License or LicenseAddition can be found in use. + cls._add_property( + "expandedlicensing_seeAlso", + ListProp(AnyURIProp()), + iri="https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/seeAlso", + compact="expandedlicensing_seeAlso", + ) + # Provides a License author's preferred text to indicate that a file is covered + # by the License. + cls._add_property( + "expandedlicensing_standardLicenseHeader", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/standardLicenseHeader", + compact="expandedlicensing_standardLicenseHeader", + ) + # Identifies the full text of a License, in SPDX templating format. + cls._add_property( + "expandedlicensing_standardLicenseTemplate", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/standardLicenseTemplate", + compact="expandedlicensing_standardLicenseTemplate", + ) + # Identifies the full text of a License or Addition. + cls._add_property( + "simplelicensing_licenseText", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/SimpleLicensing/licenseText", + min_count=1, + compact="simplelicensing_licenseText", + ) + + +# A license that is listed on the SPDX License List. +@register("https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/ListedLicense", compact_type="expandedlicensing_ListedLicense", abstract=False) +class expandedlicensing_ListedLicense(expandedlicensing_License): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Specifies the SPDX License List version in which this license or exception + # identifier was deprecated. + cls._add_property( + "expandedlicensing_deprecatedVersion", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/deprecatedVersion", + compact="expandedlicensing_deprecatedVersion", + ) + # Specifies the SPDX License List version in which this ListedLicense or + # ListedLicenseException identifier was first added. + cls._add_property( + "expandedlicensing_listVersionAdded", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/listVersionAdded", + compact="expandedlicensing_listVersionAdded", + ) + + +# Portion of an AnyLicenseInfo representing this version, or any later version, +# of the indicated License. +@register("https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/OrLaterOperator", compact_type="expandedlicensing_OrLaterOperator", abstract=False) +class expandedlicensing_OrLaterOperator(expandedlicensing_ExtendableLicense): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # A License participating in an 'or later' model. 
+ cls._add_property( + "expandedlicensing_subjectLicense", + ObjectProp(expandedlicensing_License, True), + iri="https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/subjectLicense", + min_count=1, + compact="expandedlicensing_subjectLicense", + ) + + +# Portion of an AnyLicenseInfo representing a License which has additional +# text applied to it. +@register("https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/WithAdditionOperator", compact_type="expandedlicensing_WithAdditionOperator", abstract=False) +class expandedlicensing_WithAdditionOperator(simplelicensing_AnyLicenseInfo): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # A LicenseAddition participating in a 'with addition' model. + cls._add_property( + "expandedlicensing_subjectAddition", + ObjectProp(expandedlicensing_LicenseAddition, True), + iri="https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/subjectAddition", + min_count=1, + compact="expandedlicensing_subjectAddition", + ) + # A License participating in a 'with addition' model. + cls._add_property( + "expandedlicensing_subjectExtendableLicense", + ObjectProp(expandedlicensing_ExtendableLicense, True), + iri="https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/subjectExtendableLicense", + min_count=1, + compact="expandedlicensing_subjectExtendableLicense", + ) + + +# A type of extension consisting of a list of name value pairs. +@register("https://spdx.org/rdf/3.0.0/terms/Extension/CdxPropertiesExtension", compact_type="extension_CdxPropertiesExtension", abstract=False) +class extension_CdxPropertiesExtension(extension_Extension): + NODE_KIND = NodeKind.BlankNodeOrIRI + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Provides a map of a property names to a values. + cls._add_property( + "extension_cdxProperty", + ListProp(ObjectProp(extension_CdxPropertyEntry, False)), + iri="https://spdx.org/rdf/3.0.0/terms/Extension/cdxProperty", + min_count=1, + compact="extension_cdxProperty", + ) + + +# Provides a CVSS version 2.0 assessment for a vulnerability. +@register("https://spdx.org/rdf/3.0.0/terms/Security/CvssV2VulnAssessmentRelationship", compact_type="security_CvssV2VulnAssessmentRelationship", abstract=False) +class security_CvssV2VulnAssessmentRelationship(security_VulnAssessmentRelationship): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Provides a numerical (0-10) representation of the severity of a vulnerability. + cls._add_property( + "security_score", + FloatProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Security/score", + min_count=1, + compact="security_score", + ) + # Specifies the CVSS vector string for a vulnerability. + cls._add_property( + "security_vectorString", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Security/vectorString", + min_count=1, + compact="security_vectorString", + ) + + +# Provides a CVSS version 3 assessment for a vulnerability. 
+@register("https://spdx.org/rdf/3.0.0/terms/Security/CvssV3VulnAssessmentRelationship", compact_type="security_CvssV3VulnAssessmentRelationship", abstract=False) +class security_CvssV3VulnAssessmentRelationship(security_VulnAssessmentRelationship): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Provides a numerical (0-10) representation of the severity of a vulnerability. + cls._add_property( + "security_score", + FloatProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Security/score", + min_count=1, + compact="security_score", + ) + # Specifies the CVSS qualitative severity rating of a vulnerability in relation to a piece of software. + cls._add_property( + "security_severity", + EnumProp([ + ("https://spdx.org/rdf/3.0.0/terms/Security/CvssSeverityType/critical", "critical"), + ("https://spdx.org/rdf/3.0.0/terms/Security/CvssSeverityType/high", "high"), + ("https://spdx.org/rdf/3.0.0/terms/Security/CvssSeverityType/low", "low"), + ("https://spdx.org/rdf/3.0.0/terms/Security/CvssSeverityType/medium", "medium"), + ("https://spdx.org/rdf/3.0.0/terms/Security/CvssSeverityType/none", "none"), + ]), + iri="https://spdx.org/rdf/3.0.0/terms/Security/severity", + min_count=1, + compact="security_severity", + ) + # Specifies the CVSS vector string for a vulnerability. + cls._add_property( + "security_vectorString", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Security/vectorString", + min_count=1, + compact="security_vectorString", + ) + + +# Provides a CVSS version 4 assessment for a vulnerability. +@register("https://spdx.org/rdf/3.0.0/terms/Security/CvssV4VulnAssessmentRelationship", compact_type="security_CvssV4VulnAssessmentRelationship", abstract=False) +class security_CvssV4VulnAssessmentRelationship(security_VulnAssessmentRelationship): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Provides a numerical (0-10) representation of the severity of a vulnerability. + cls._add_property( + "security_score", + FloatProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Security/score", + min_count=1, + compact="security_score", + ) + # Specifies the CVSS qualitative severity rating of a vulnerability in relation to a piece of software. + cls._add_property( + "security_severity", + EnumProp([ + ("https://spdx.org/rdf/3.0.0/terms/Security/CvssSeverityType/critical", "critical"), + ("https://spdx.org/rdf/3.0.0/terms/Security/CvssSeverityType/high", "high"), + ("https://spdx.org/rdf/3.0.0/terms/Security/CvssSeverityType/low", "low"), + ("https://spdx.org/rdf/3.0.0/terms/Security/CvssSeverityType/medium", "medium"), + ("https://spdx.org/rdf/3.0.0/terms/Security/CvssSeverityType/none", "none"), + ]), + iri="https://spdx.org/rdf/3.0.0/terms/Security/severity", + min_count=1, + compact="security_severity", + ) + # Specifies the CVSS vector string for a vulnerability. + cls._add_property( + "security_vectorString", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Security/vectorString", + min_count=1, + compact="security_vectorString", + ) + + +# Provides an EPSS assessment for a vulnerability. 
+@register("https://spdx.org/rdf/3.0.0/terms/Security/EpssVulnAssessmentRelationship", compact_type="security_EpssVulnAssessmentRelationship", abstract=False) +class security_EpssVulnAssessmentRelationship(security_VulnAssessmentRelationship): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # The percentile of the current probability score. + cls._add_property( + "security_percentile", + FloatProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Security/percentile", + min_count=1, + compact="security_percentile", + ) + # A probability score between 0 and 1 of a vulnerability being exploited. + cls._add_property( + "security_probability", + FloatProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Security/probability", + min_count=1, + compact="security_probability", + ) + # Specifies the time when a vulnerability was published. + cls._add_property( + "security_publishedTime", + DateTimeStampProp(pattern=r"^\d\d\d\d-\d\d-\d\dT\d\d:\d\d:\d\dZ$",), + iri="https://spdx.org/rdf/3.0.0/terms/Security/publishedTime", + min_count=1, + compact="security_publishedTime", + ) + + +# Provides an exploit assessment of a vulnerability. +@register("https://spdx.org/rdf/3.0.0/terms/Security/ExploitCatalogVulnAssessmentRelationship", compact_type="security_ExploitCatalogVulnAssessmentRelationship", abstract=False) +class security_ExploitCatalogVulnAssessmentRelationship(security_VulnAssessmentRelationship): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Specifies the exploit catalog type. + cls._add_property( + "security_catalogType", + EnumProp([ + ("https://spdx.org/rdf/3.0.0/terms/Security/ExploitCatalogType/kev", "kev"), + ("https://spdx.org/rdf/3.0.0/terms/Security/ExploitCatalogType/other", "other"), + ]), + iri="https://spdx.org/rdf/3.0.0/terms/Security/catalogType", + min_count=1, + compact="security_catalogType", + ) + # Describe that a CVE is known to have an exploit because it's been listed in an exploit catalog. + cls._add_property( + "security_exploited", + BooleanProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Security/exploited", + min_count=1, + compact="security_exploited", + ) + # Provides the location of an exploit catalog. + cls._add_property( + "security_locator", + AnyURIProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Security/locator", + min_count=1, + compact="security_locator", + ) + + +# Provides an SSVC assessment for a vulnerability. 
+@register("https://spdx.org/rdf/3.0.0/terms/Security/SsvcVulnAssessmentRelationship", compact_type="security_SsvcVulnAssessmentRelationship", abstract=False) +class security_SsvcVulnAssessmentRelationship(security_VulnAssessmentRelationship): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Provide the enumeration of possible decisions in the Stakeholder-Specific Vulnerability Categorization (SSVC) decision tree [https://www.cisa.gov/sites/default/files/publications/cisa-ssvc-guide%20508c.pdf](https://www.cisa.gov/sites/default/files/publications/cisa-ssvc-guide%20508c.pdf) + cls._add_property( + "security_decisionType", + EnumProp([ + ("https://spdx.org/rdf/3.0.0/terms/Security/SsvcDecisionType/act", "act"), + ("https://spdx.org/rdf/3.0.0/terms/Security/SsvcDecisionType/attend", "attend"), + ("https://spdx.org/rdf/3.0.0/terms/Security/SsvcDecisionType/track", "track"), + ("https://spdx.org/rdf/3.0.0/terms/Security/SsvcDecisionType/trackStar", "trackStar"), + ]), + iri="https://spdx.org/rdf/3.0.0/terms/Security/decisionType", + min_count=1, + compact="security_decisionType", + ) + + +# Asbtract ancestor class for all VEX relationships +@register("https://spdx.org/rdf/3.0.0/terms/Security/VexVulnAssessmentRelationship", compact_type="security_VexVulnAssessmentRelationship", abstract=True) +class security_VexVulnAssessmentRelationship(security_VulnAssessmentRelationship): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Conveys information about how VEX status was determined. + cls._add_property( + "security_statusNotes", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Security/statusNotes", + compact="security_statusNotes", + ) + # Specifies the version of a VEX statement. + cls._add_property( + "security_vexVersion", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Security/vexVersion", + compact="security_vexVersion", + ) + + +# Specifies a vulnerability and its associated information. +@register("https://spdx.org/rdf/3.0.0/terms/Security/Vulnerability", compact_type="security_Vulnerability", abstract=False) +class security_Vulnerability(Artifact): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Specifies a time when a vulnerability assessment was modified + cls._add_property( + "security_modifiedTime", + DateTimeStampProp(pattern=r"^\d\d\d\d-\d\d-\d\dT\d\d:\d\d:\d\dZ$",), + iri="https://spdx.org/rdf/3.0.0/terms/Security/modifiedTime", + compact="security_modifiedTime", + ) + # Specifies the time when a vulnerability was published. + cls._add_property( + "security_publishedTime", + DateTimeStampProp(pattern=r"^\d\d\d\d-\d\d-\d\dT\d\d:\d\d:\d\dZ$",), + iri="https://spdx.org/rdf/3.0.0/terms/Security/publishedTime", + compact="security_publishedTime", + ) + # Specified the time and date when a vulnerability was withdrawn. + cls._add_property( + "security_withdrawnTime", + DateTimeStampProp(pattern=r"^\d\d\d\d-\d\d-\d\dT\d\d:\d\d:\d\dZ$",), + iri="https://spdx.org/rdf/3.0.0/terms/Security/withdrawnTime", + compact="security_withdrawnTime", + ) + + +# A distinct article or unit related to Software. 
+@register("https://spdx.org/rdf/3.0.0/terms/Software/SoftwareArtifact", compact_type="software_SoftwareArtifact", abstract=True) +class software_SoftwareArtifact(Artifact): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Provides additional purpose information of the software artifact. + cls._add_property( + "software_additionalPurpose", + ListProp(EnumProp([ + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/application", "application"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/archive", "archive"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/bom", "bom"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/configuration", "configuration"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/container", "container"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/data", "data"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/device", "device"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/deviceDriver", "deviceDriver"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/diskImage", "diskImage"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/documentation", "documentation"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/evidence", "evidence"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/executable", "executable"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/file", "file"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/filesystemImage", "filesystemImage"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/firmware", "firmware"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/framework", "framework"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/install", "install"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/library", "library"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/manifest", "manifest"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/model", "model"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/module", "module"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/operatingSystem", "operatingSystem"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/other", "other"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/patch", "patch"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/platform", "platform"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/requirement", "requirement"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/source", "source"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/specification", "specification"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/test", "test"), + ])), + iri="https://spdx.org/rdf/3.0.0/terms/Software/additionalPurpose", + compact="software_additionalPurpose", + ) + # Provides a place for the SPDX data creator to record acknowledgement text for + # a software Package, File or Snippet. 
+ cls._add_property( + "software_attributionText", + ListProp(StringProp()), + iri="https://spdx.org/rdf/3.0.0/terms/Software/attributionText", + compact="software_attributionText", + ) + # A canonical, unique, immutable identifier of the artifact content, that may be + # used for verifying its identity and/or integrity. + cls._add_property( + "software_contentIdentifier", + ListProp(ObjectProp(software_ContentIdentifier, False)), + iri="https://spdx.org/rdf/3.0.0/terms/Software/contentIdentifier", + compact="software_contentIdentifier", + ) + # Identifies the text of one or more copyright notices for a software Package, + # File or Snippet, if any. + cls._add_property( + "software_copyrightText", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Software/copyrightText", + compact="software_copyrightText", + ) + # Provides information about the primary purpose of the software artifact. + cls._add_property( + "software_primaryPurpose", + EnumProp([ + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/application", "application"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/archive", "archive"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/bom", "bom"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/configuration", "configuration"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/container", "container"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/data", "data"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/device", "device"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/deviceDriver", "deviceDriver"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/diskImage", "diskImage"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/documentation", "documentation"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/evidence", "evidence"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/executable", "executable"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/file", "file"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/filesystemImage", "filesystemImage"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/firmware", "firmware"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/framework", "framework"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/install", "install"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/library", "library"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/manifest", "manifest"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/model", "model"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/module", "module"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/operatingSystem", "operatingSystem"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/other", "other"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/patch", "patch"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/platform", "platform"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/requirement", "requirement"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/source", "source"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/specification", "specification"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SoftwarePurpose/test", "test"), + ]), + iri="https://spdx.org/rdf/3.0.0/terms/Software/primaryPurpose", + 
compact="software_primaryPurpose", + ) + + +# A container for a grouping of SPDX-3.0 content characterizing details +# (provenence, composition, licensing, etc.) about a product. +@register("https://spdx.org/rdf/3.0.0/terms/Core/Bom", compact_type="Bom", abstract=False) +class Bom(Bundle): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + +# A license that is not listed on the SPDX License List. +@register("https://spdx.org/rdf/3.0.0/terms/ExpandedLicensing/CustomLicense", compact_type="expandedlicensing_CustomLicense", abstract=False) +class expandedlicensing_CustomLicense(expandedlicensing_License): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + +# Connects a vulnerability and an element designating the element as a product +# affected by the vulnerability. +@register("https://spdx.org/rdf/3.0.0/terms/Security/VexAffectedVulnAssessmentRelationship", compact_type="security_VexAffectedVulnAssessmentRelationship", abstract=False) +class security_VexAffectedVulnAssessmentRelationship(security_VexVulnAssessmentRelationship): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Provides advise on how to mitigate or remediate a vulnerability when a VEX product + # is affected by it. + cls._add_property( + "security_actionStatement", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Security/actionStatement", + compact="security_actionStatement", + ) + # Records the time when a recommended action was communicated in a VEX statement + # to mitigate a vulnerability. + cls._add_property( + "security_actionStatementTime", + ListProp(DateTimeStampProp(pattern=r"^\d\d\d\d-\d\d-\d\dT\d\d:\d\d:\d\dZ$",)), + iri="https://spdx.org/rdf/3.0.0/terms/Security/actionStatementTime", + compact="security_actionStatementTime", + ) + + +# Links a vulnerability and elements representing products (in the VEX sense) where +# a fix has been applied and are no longer affected. +@register("https://spdx.org/rdf/3.0.0/terms/Security/VexFixedVulnAssessmentRelationship", compact_type="security_VexFixedVulnAssessmentRelationship", abstract=False) +class security_VexFixedVulnAssessmentRelationship(security_VexVulnAssessmentRelationship): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + +# Links a vulnerability and one or more elements designating the latter as products +# not affected by the vulnerability. +@register("https://spdx.org/rdf/3.0.0/terms/Security/VexNotAffectedVulnAssessmentRelationship", compact_type="security_VexNotAffectedVulnAssessmentRelationship", abstract=False) +class security_VexNotAffectedVulnAssessmentRelationship(security_VexVulnAssessmentRelationship): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Explains why a VEX product is not affected by a vulnerability. It is an + # alternative in VexNotAffectedVulnAssessmentRelationship to the machine-readable + # justification label. + cls._add_property( + "security_impactStatement", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Security/impactStatement", + compact="security_impactStatement", + ) + # Timestamp of impact statement. 
+ cls._add_property( + "security_impactStatementTime", + DateTimeStampProp(pattern=r"^\d\d\d\d-\d\d-\d\dT\d\d:\d\d:\d\dZ$",), + iri="https://spdx.org/rdf/3.0.0/terms/Security/impactStatementTime", + compact="security_impactStatementTime", + ) + # Impact justification label to be used when linking a vulnerability to an element + # representing a VEX product with a VexNotAffectedVulnAssessmentRelationship + # relationship. + cls._add_property( + "security_justificationType", + EnumProp([ + ("https://spdx.org/rdf/3.0.0/terms/Security/VexJustificationType/componentNotPresent", "componentNotPresent"), + ("https://spdx.org/rdf/3.0.0/terms/Security/VexJustificationType/inlineMitigationsAlreadyExist", "inlineMitigationsAlreadyExist"), + ("https://spdx.org/rdf/3.0.0/terms/Security/VexJustificationType/vulnerableCodeCannotBeControlledByAdversary", "vulnerableCodeCannotBeControlledByAdversary"), + ("https://spdx.org/rdf/3.0.0/terms/Security/VexJustificationType/vulnerableCodeNotInExecutePath", "vulnerableCodeNotInExecutePath"), + ("https://spdx.org/rdf/3.0.0/terms/Security/VexJustificationType/vulnerableCodeNotPresent", "vulnerableCodeNotPresent"), + ]), + iri="https://spdx.org/rdf/3.0.0/terms/Security/justificationType", + compact="security_justificationType", + ) + + +# Designates elements as products where the impact of a vulnerability is being +# investigated. +@register("https://spdx.org/rdf/3.0.0/terms/Security/VexUnderInvestigationVulnAssessmentRelationship", compact_type="security_VexUnderInvestigationVulnAssessmentRelationship", abstract=False) +class security_VexUnderInvestigationVulnAssessmentRelationship(security_VexVulnAssessmentRelationship): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + +# Refers to any object that stores content on a computer. +@register("https://spdx.org/rdf/3.0.0/terms/Software/File", compact_type="software_File", abstract=False) +class software_File(software_SoftwareArtifact): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Provides information about the content type of an Element. + cls._add_property( + "software_contentType", + StringProp(pattern=r"^[^\/]+\/[^\/]+$",), + iri="https://spdx.org/rdf/3.0.0/terms/Software/contentType", + compact="software_contentType", + ) + # Describes if a given file is a directory or non-directory kind of file. + cls._add_property( + "software_fileKind", + EnumProp([ + ("https://spdx.org/rdf/3.0.0/terms/Software/FileKindType/directory", "directory"), + ("https://spdx.org/rdf/3.0.0/terms/Software/FileKindType/file", "file"), + ]), + iri="https://spdx.org/rdf/3.0.0/terms/Software/fileKind", + compact="software_fileKind", + ) + + +# Refers to any unit of content that can be associated with a distribution of +# software. +@register("https://spdx.org/rdf/3.0.0/terms/Software/Package", compact_type="software_Package", abstract=False) +class software_Package(software_SoftwareArtifact): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Identifies the download Uniform Resource Identifier for the package at the time + # that the document was created. 
+ cls._add_property( + "software_downloadLocation", + AnyURIProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Software/downloadLocation", + compact="software_downloadLocation", + ) + # A place for the SPDX document creator to record a website that serves as the + # package's home page. + cls._add_property( + "software_homePage", + AnyURIProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Software/homePage", + compact="software_homePage", + ) + # Provides a place for the SPDX data creator to record the package URL string + # (in accordance with the + # [package URL spec](https://github.com/package-url/purl-spec/blob/master/PURL-SPECIFICATION.rst)) + # for a software Package. + cls._add_property( + "software_packageUrl", + AnyURIProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Software/packageUrl", + compact="software_packageUrl", + ) + # Identify the version of a package. + cls._add_property( + "software_packageVersion", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Software/packageVersion", + compact="software_packageVersion", + ) + # Records any relevant background information or additional comments + # about the origin of the package. + cls._add_property( + "software_sourceInfo", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Software/sourceInfo", + compact="software_sourceInfo", + ) + + +# A collection of SPDX Elements describing a single package. +@register("https://spdx.org/rdf/3.0.0/terms/Software/Sbom", compact_type="software_Sbom", abstract=False) +class software_Sbom(Bom): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Provides information about the type of an SBOM. + cls._add_property( + "software_sbomType", + ListProp(EnumProp([ + ("https://spdx.org/rdf/3.0.0/terms/Software/SbomType/analyzed", "analyzed"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SbomType/build", "build"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SbomType/deployed", "deployed"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SbomType/design", "design"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SbomType/runtime", "runtime"), + ("https://spdx.org/rdf/3.0.0/terms/Software/SbomType/source", "source"), + ])), + iri="https://spdx.org/rdf/3.0.0/terms/Software/sbomType", + compact="software_sbomType", + ) + + +# Describes a certain part of a file. +@register("https://spdx.org/rdf/3.0.0/terms/Software/Snippet", compact_type="software_Snippet", abstract=False) +class software_Snippet(software_SoftwareArtifact): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Defines the byte range in the original host file that the snippet information + # applies to. + cls._add_property( + "software_byteRange", + ObjectProp(PositiveIntegerRange, False), + iri="https://spdx.org/rdf/3.0.0/terms/Software/byteRange", + compact="software_byteRange", + ) + # Defines the line range in the original host file that the snippet information + # applies to. + cls._add_property( + "software_lineRange", + ObjectProp(PositiveIntegerRange, False), + iri="https://spdx.org/rdf/3.0.0/terms/Software/lineRange", + compact="software_lineRange", + ) + # Defines the original host file that the snippet information applies to. 
+ cls._add_property( + "software_snippetFromFile", + ObjectProp(software_File, True), + iri="https://spdx.org/rdf/3.0.0/terms/Software/snippetFromFile", + min_count=1, + compact="software_snippetFromFile", + ) + + +# Specifies an AI package and its associated information. +@register("https://spdx.org/rdf/3.0.0/terms/AI/AIPackage", compact_type="ai_AIPackage", abstract=False) +class ai_AIPackage(software_Package): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # States if a human is involved in the decisions of the AI software. + cls._add_property( + "ai_autonomyType", + EnumProp([ + ("https://spdx.org/rdf/3.0.0/terms/Core/PresenceType/no", "no"), + ("https://spdx.org/rdf/3.0.0/terms/Core/PresenceType/noAssertion", "noAssertion"), + ("https://spdx.org/rdf/3.0.0/terms/Core/PresenceType/yes", "yes"), + ]), + iri="https://spdx.org/rdf/3.0.0/terms/AI/autonomyType", + compact="ai_autonomyType", + ) + # Captures the domain in which the AI package can be used. + cls._add_property( + "ai_domain", + ListProp(StringProp()), + iri="https://spdx.org/rdf/3.0.0/terms/AI/domain", + compact="ai_domain", + ) + # Indicates the amount of energy consumed to train the AI model. + cls._add_property( + "ai_energyConsumption", + ObjectProp(ai_EnergyConsumption, False), + iri="https://spdx.org/rdf/3.0.0/terms/AI/energyConsumption", + compact="ai_energyConsumption", + ) + # Records a hyperparameter used to build the AI model contained in the AI + # package. + cls._add_property( + "ai_hyperparameter", + ListProp(ObjectProp(DictionaryEntry, False)), + iri="https://spdx.org/rdf/3.0.0/terms/AI/hyperparameter", + compact="ai_hyperparameter", + ) + # Provides relevant information about the AI software, not including the model + # description. + cls._add_property( + "ai_informationAboutApplication", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/AI/informationAboutApplication", + compact="ai_informationAboutApplication", + ) + # Describes relevant information about different steps of the training process. + cls._add_property( + "ai_informationAboutTraining", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/AI/informationAboutTraining", + compact="ai_informationAboutTraining", + ) + # Captures a limitation of the AI software. + cls._add_property( + "ai_limitation", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/AI/limitation", + compact="ai_limitation", + ) + # Records the measurement of prediction quality of the AI model. + cls._add_property( + "ai_metric", + ListProp(ObjectProp(DictionaryEntry, False)), + iri="https://spdx.org/rdf/3.0.0/terms/AI/metric", + compact="ai_metric", + ) + # Captures the threshold that was used for computation of a metric described in + # the metric field. + cls._add_property( + "ai_metricDecisionThreshold", + ListProp(ObjectProp(DictionaryEntry, False)), + iri="https://spdx.org/rdf/3.0.0/terms/AI/metricDecisionThreshold", + compact="ai_metricDecisionThreshold", + ) + # Describes all the preprocessing steps applied to the training data before the + # model training. + cls._add_property( + "ai_modelDataPreprocessing", + ListProp(StringProp()), + iri="https://spdx.org/rdf/3.0.0/terms/AI/modelDataPreprocessing", + compact="ai_modelDataPreprocessing", + ) + # Describes methods that can be used to explain the model. 
+ cls._add_property( + "ai_modelExplainability", + ListProp(StringProp()), + iri="https://spdx.org/rdf/3.0.0/terms/AI/modelExplainability", + compact="ai_modelExplainability", + ) + # Records the results of general safety risk assessment of the AI system. + cls._add_property( + "ai_safetyRiskAssessment", + EnumProp([ + ("https://spdx.org/rdf/3.0.0/terms/AI/SafetyRiskAssessmentType/high", "high"), + ("https://spdx.org/rdf/3.0.0/terms/AI/SafetyRiskAssessmentType/low", "low"), + ("https://spdx.org/rdf/3.0.0/terms/AI/SafetyRiskAssessmentType/medium", "medium"), + ("https://spdx.org/rdf/3.0.0/terms/AI/SafetyRiskAssessmentType/serious", "serious"), + ]), + iri="https://spdx.org/rdf/3.0.0/terms/AI/safetyRiskAssessment", + compact="ai_safetyRiskAssessment", + ) + # Captures a standard that is being complied with. + cls._add_property( + "ai_standardCompliance", + ListProp(StringProp()), + iri="https://spdx.org/rdf/3.0.0/terms/AI/standardCompliance", + compact="ai_standardCompliance", + ) + # Records the type of the model used in the AI software. + cls._add_property( + "ai_typeOfModel", + ListProp(StringProp()), + iri="https://spdx.org/rdf/3.0.0/terms/AI/typeOfModel", + compact="ai_typeOfModel", + ) + # Records if sensitive personal information is used during model training or + # could be used during the inference. + cls._add_property( + "ai_useSensitivePersonalInformation", + EnumProp([ + ("https://spdx.org/rdf/3.0.0/terms/Core/PresenceType/no", "no"), + ("https://spdx.org/rdf/3.0.0/terms/Core/PresenceType/noAssertion", "noAssertion"), + ("https://spdx.org/rdf/3.0.0/terms/Core/PresenceType/yes", "yes"), + ]), + iri="https://spdx.org/rdf/3.0.0/terms/AI/useSensitivePersonalInformation", + compact="ai_useSensitivePersonalInformation", + ) + + +# Specifies a data package and its associated information. +@register("https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetPackage", compact_type="dataset_DatasetPackage", abstract=False) +class dataset_DatasetPackage(software_Package): + NODE_KIND = NodeKind.BlankNodeOrIRI + ID_ALIAS = "spdxId" + NAMED_INDIVIDUALS = { + } + + @classmethod + def _register_props(cls): + super()._register_props() + # Describes the anonymization methods used. + cls._add_property( + "dataset_anonymizationMethodUsed", + ListProp(StringProp()), + iri="https://spdx.org/rdf/3.0.0/terms/Dataset/anonymizationMethodUsed", + compact="dataset_anonymizationMethodUsed", + ) + # Describes the confidentiality level of the data points contained in the dataset. + cls._add_property( + "dataset_confidentialityLevel", + EnumProp([ + ("https://spdx.org/rdf/3.0.0/terms/Dataset/ConfidentialityLevelType/amber", "amber"), + ("https://spdx.org/rdf/3.0.0/terms/Dataset/ConfidentialityLevelType/clear", "clear"), + ("https://spdx.org/rdf/3.0.0/terms/Dataset/ConfidentialityLevelType/green", "green"), + ("https://spdx.org/rdf/3.0.0/terms/Dataset/ConfidentialityLevelType/red", "red"), + ]), + iri="https://spdx.org/rdf/3.0.0/terms/Dataset/confidentialityLevel", + compact="dataset_confidentialityLevel", + ) + # Describes how the dataset was collected. + cls._add_property( + "dataset_dataCollectionProcess", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Dataset/dataCollectionProcess", + compact="dataset_dataCollectionProcess", + ) + # Describes the preprocessing steps that were applied to the raw data to create the given dataset. 
+ cls._add_property( + "dataset_dataPreprocessing", + ListProp(StringProp()), + iri="https://spdx.org/rdf/3.0.0/terms/Dataset/dataPreprocessing", + compact="dataset_dataPreprocessing", + ) + # The field describes the availability of a dataset. + cls._add_property( + "dataset_datasetAvailability", + EnumProp([ + ("https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetAvailabilityType/clickthrough", "clickthrough"), + ("https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetAvailabilityType/directDownload", "directDownload"), + ("https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetAvailabilityType/query", "query"), + ("https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetAvailabilityType/registration", "registration"), + ("https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetAvailabilityType/scrapingScript", "scrapingScript"), + ]), + iri="https://spdx.org/rdf/3.0.0/terms/Dataset/datasetAvailability", + compact="dataset_datasetAvailability", + ) + # Describes potentially noisy elements of the dataset. + cls._add_property( + "dataset_datasetNoise", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Dataset/datasetNoise", + compact="dataset_datasetNoise", + ) + # Captures the size of the dataset. + cls._add_property( + "dataset_datasetSize", + NonNegativeIntegerProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Dataset/datasetSize", + compact="dataset_datasetSize", + ) + # Describes the type of the given dataset. + cls._add_property( + "dataset_datasetType", + ListProp(EnumProp([ + ("https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/audio", "audio"), + ("https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/categorical", "categorical"), + ("https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/graph", "graph"), + ("https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/image", "image"), + ("https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/noAssertion", "noAssertion"), + ("https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/numeric", "numeric"), + ("https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/other", "other"), + ("https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/sensor", "sensor"), + ("https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/structured", "structured"), + ("https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/syntactic", "syntactic"), + ("https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/text", "text"), + ("https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/timeseries", "timeseries"), + ("https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/timestamp", "timestamp"), + ("https://spdx.org/rdf/3.0.0/terms/Dataset/DatasetType/video", "video"), + ])), + iri="https://spdx.org/rdf/3.0.0/terms/Dataset/datasetType", + min_count=1, + compact="dataset_datasetType", + ) + # Describes a mechanism to update the dataset. + cls._add_property( + "dataset_datasetUpdateMechanism", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Dataset/datasetUpdateMechanism", + compact="dataset_datasetUpdateMechanism", + ) + # Describes if any sensitive personal information is present in the dataset. + cls._add_property( + "dataset_hasSensitivePersonalInformation", + EnumProp([ + ("https://spdx.org/rdf/3.0.0/terms/Core/PresenceType/no", "no"), + ("https://spdx.org/rdf/3.0.0/terms/Core/PresenceType/noAssertion", "noAssertion"), + ("https://spdx.org/rdf/3.0.0/terms/Core/PresenceType/yes", "yes"), + ]), + iri="https://spdx.org/rdf/3.0.0/terms/Dataset/hasSensitivePersonalInformation", + compact="dataset_hasSensitivePersonalInformation", + ) + # Describes what the given dataset should be used for. 
+ cls._add_property( + "dataset_intendedUse", + StringProp(), + iri="https://spdx.org/rdf/3.0.0/terms/Dataset/intendedUse", + compact="dataset_intendedUse", + ) + # Records the biases that the dataset is known to encompass. + cls._add_property( + "dataset_knownBias", + ListProp(StringProp()), + iri="https://spdx.org/rdf/3.0.0/terms/Dataset/knownBias", + compact="dataset_knownBias", + ) + # Describes a sensor used for collecting the data. + cls._add_property( + "dataset_sensor", + ListProp(ObjectProp(DictionaryEntry, False)), + iri="https://spdx.org/rdf/3.0.0/terms/Dataset/sensor", + compact="dataset_sensor", + ) + + +"""Format Guard""" +# fmt: on + + +def main(): + import argparse + from pathlib import Path + + parser = argparse.ArgumentParser(description="Python SHACL model test") + parser.add_argument("infile", type=Path, help="Input file") + parser.add_argument("--print", action="store_true", help="Print object tree") + parser.add_argument("--outfile", type=Path, help="Output file") + + args = parser.parse_args() + + objectset = SHACLObjectSet() + with args.infile.open("r") as f: + d = JSONLDDeserializer() + d.read(f, objectset) + + if args.print: + print_tree(objectset.objects) + + if args.outfile: + with args.outfile.open("wb") as f: + s = JSONLDSerializer() + s.write(objectset, f) + + return 0 + + +if __name__ == "__main__": + sys.exit(main()) From patchwork Fri Jul 12 15:58:14 2024 Content-Type: text/plain; charset="utf-8" MIME-Version: 1.0 Content-Transfer-Encoding: 7bit X-Patchwork-Submitter: Joshua Watt X-Patchwork-Id: 46267 Return-Path: X-Spam-Checker-Version: SpamAssassin 3.4.0 (2014-02-07) on aws-us-west-2-korg-lkml-1.web.codeaurora.org Received: from aws-us-west-2-korg-lkml-1.web.codeaurora.org (localhost.localdomain [127.0.0.1]) by smtp.lore.kernel.org (Postfix) with ESMTP id 2BE98C3DA50 for ; Fri, 12 Jul 2024 16:03:23 +0000 (UTC) Received: from mail-oo1-f44.google.com (mail-oo1-f44.google.com [209.85.161.44]) by mx.groups.io with SMTP id smtpd.web10.11471.1720800196894726960 for ; Fri, 12 Jul 2024 09:03:17 -0700 Authentication-Results: mx.groups.io; dkim=pass header.i=@gmail.com header.s=20230601 header.b=G4H8EJ8c; spf=pass (domain: gmail.com, ip: 209.85.161.44, mailfrom: jpewhacker@gmail.com) Received: by mail-oo1-f44.google.com with SMTP id 006d021491bc7-5b53bb4bebaso1106880eaf.0 for ; Fri, 12 Jul 2024 09:03:16 -0700 (PDT) DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=gmail.com; s=20230601; t=1720800195; x=1721404995; darn=lists.openembedded.org; h=content-transfer-encoding:mime-version:references:in-reply-to :message-id:date:subject:cc:to:from:from:to:cc:subject:date :message-id:reply-to; bh=CZBWMwwd9l76frSVWBofn7BnkyRG+ITkf3ToQYJlNsg=; b=G4H8EJ8ce14OQOGz0GNJBZU0zGJcp57TPfR7kv0p1bhjIdrXDXkPyt/f4o4oFmZcG1 NrGQLewk8NvfWSM6HcefQof3fpMSkylXVEIVVE3O4uHXMBMJ/9KIsOwS5G7bBeUWGSru 0SqO5pe8UWt9C3uUnBVM0pD65/f3RgI+9WqD+/eB36zKkjvP6Qq3l0EDQ67n+MIVnGK8 N0O0P5FcFLfi5GdFV3ZrpJlC3RBiT/gGSzp9Lkk0y35dHvr6XPcBLsUJxfqIDYsVXp0Z HalKGKPRRBneWPd9rUpSHJrgunVBaZUeao/36Riz0zAuZkw+6QYTI5Ze8GP7/dD0Fpev 22Uw== X-Google-DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=1e100.net; s=20230601; t=1720800195; x=1721404995; h=content-transfer-encoding:mime-version:references:in-reply-to :message-id:date:subject:cc:to:from:x-gm-message-state:from:to:cc :subject:date:message-id:reply-to; bh=CZBWMwwd9l76frSVWBofn7BnkyRG+ITkf3ToQYJlNsg=; b=QuW3gz07RbnLCpAiuyR5azORo7MCi3VFbAgtmDid5BhhPZwmFK/b5wgD5uD4TH/Vsu vV1SoPf9hliUSfG5ahBC/wFNtiwibBLfGO6T4A6BuU71gO0qCF7OiUsy1Ph4IX5kFcv5 
mk8HWfrNyBa3iWABlg2Yx2806FY8uOUULUg7Hd084KcTHCpSelgDkgctlkRyyQAO7Mrm qG6DojoylotP6eOb7zb9Qor/WRiTWxqsMsqUYHu9FsLXgTW1BjyFZCHjF87F/fAy2+SL PN1pE0gAVipWuEAce10hiLdTkyEa1r/U+toXkLPJLy1UQhn3YSRorlTNulmowNxOK4r3 j5SA== X-Gm-Message-State: AOJu0Yy1bkmkgMIzE7sv2GPK7arxSBgpPTipTfEfDqwo/D13UhbuGPdk FL7rMLtK+lWGFUuMwV6vJ8Qi/rJCuV0HtxKv4F04jFmO+RN4xdvqg/ZQjQ== X-Google-Smtp-Source: AGHT+IEs8Y9DastxkCWHDRrtgHmjEylnEe/jXanfZGGp+CvDcnZqisyxrPPfTmUZ8B/2+ZOPnT4KrA== X-Received: by 2002:a05:6870:2a4e:b0:24f:c715:a4d4 with SMTP id 586e51a60fabf-25eaebdb167mr10058517fac.40.1720800194857; Fri, 12 Jul 2024 09:03:14 -0700 (PDT) Received: from localhost.localdomain ([2601:282:4300:19e0::4a71]) by smtp.gmail.com with ESMTPSA id 586e51a60fabf-25eaa29d16dsm2267694fac.53.2024.07.12.09.03.13 (version=TLS1_3 cipher=TLS_AES_256_GCM_SHA384 bits=256/256); Fri, 12 Jul 2024 09:03:14 -0700 (PDT) From: Joshua Watt X-Google-Original-From: Joshua Watt To: openembedded-core@lists.openembedded.org Cc: Joshua Watt Subject: [OE-core][PATCH v6 04/12] classes-global/staging: Exclude do_create_spdx from automatic sysroot extension Date: Fri, 12 Jul 2024 09:58:14 -0600 Message-ID: <20240712160304.3514496-5-JPEWhacker@gmail.com> X-Mailer: git-send-email 2.45.2 In-Reply-To: <20240712160304.3514496-1-JPEWhacker@gmail.com> References: <20240703140059.4096394-1-JPEWhacker@gmail.com> <20240712160304.3514496-1-JPEWhacker@gmail.com> MIME-Version: 1.0 List-Id: X-Webhook-Received: from li982-79.members.linode.com [45.33.32.79] by aws-us-west-2-korg-lkml-1.web.codeaurora.org with HTTPS for ; Fri, 12 Jul 2024 16:03:23 -0000 X-Groupsio-URL: https://lists.openembedded.org/g/openembedded-core/message/201838 do_create_spdx is an outlier in that it doesn't need the recipe-specific sysroot (RSS) to be extended just because it depends on do_populate_sysroot. In fact, it only depends on do_populate_sysroot so it can see the actual recipe's sysroot, and attempting to extend the sysroot can cause problems for some recipes (e.g. if a recipe does do_populate_sysroot[noexec] = "1"). As such, explicitly exclude do_create_spdx from extending the sysroot just because it depends on do_populate_sysroot.
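For illustration, the parenthetical example above corresponds to a recipe along these lines (a minimal, hypothetical sketch; the recipe contents are invented here and are not part of this series). Such a recipe never actually populates a sysroot, so having extend_recipe_sysroot prepended to tasks that depend on its do_populate_sysroot can cause the problems described above; do_create_spdx only needs the dependency so it can see the recipe's own sysroot:

# Hypothetical example recipe -- illustration only, not from this series.
# It never populates a sysroot, so automatic sysroot extension should not be
# triggered for tasks that merely depend on its do_populate_sysroot.
SUMMARY = "Recipe that opts out of sysroot population"
LICENSE = "CLOSED"

do_populate_sysroot[noexec] = "1"

The hunk below implements the exclusion by listing do_create_spdx alongside do_prepare_recipe_sysroot and skipping both in staging_taskhandler().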
Signed-off-by: Joshua Watt --- meta/classes-global/staging.bbclass | 9 ++++++++- 1 file changed, 8 insertions(+), 1 deletion(-) diff --git a/meta/classes-global/staging.bbclass b/meta/classes-global/staging.bbclass index 3678a1b4415..c2213ffa2b4 100644 --- a/meta/classes-global/staging.bbclass +++ b/meta/classes-global/staging.bbclass @@ -652,10 +652,17 @@ python do_prepare_recipe_sysroot () { addtask do_prepare_recipe_sysroot before do_configure after do_fetch python staging_taskhandler() { + EXCLUDED_TASKS = ( + "do_prepare_recipe_sysroot", + "do_create_spdx", + ) bbtasks = e.tasklist for task in bbtasks: + if task in EXCLUDED_TASKS: + continue + deps = d.getVarFlag(task, "depends") - if task != 'do_prepare_recipe_sysroot' and (task == "do_configure" or (deps and "populate_sysroot" in deps)): + if task == "do_configure" or (deps and "populate_sysroot" in deps): d.prependVarFlag(task, "prefuncs", "extend_recipe_sysroot ") } staging_taskhandler[eventmask] = "bb.event.RecipeTaskPreProcess" From patchwork Fri Jul 12 15:58:15 2024 Content-Type: text/plain; charset="utf-8" MIME-Version: 1.0 Content-Transfer-Encoding: 7bit X-Patchwork-Submitter: Joshua Watt X-Patchwork-Id: 46264 Return-Path: X-Spam-Checker-Version: SpamAssassin 3.4.0 (2014-02-07) on aws-us-west-2-korg-lkml-1.web.codeaurora.org Received: from aws-us-west-2-korg-lkml-1.web.codeaurora.org (localhost.localdomain [127.0.0.1]) by smtp.lore.kernel.org (Postfix) with ESMTP id DC15AC2BD09 for ; Fri, 12 Jul 2024 16:03:22 +0000 (UTC) Received: from mail-oa1-f54.google.com (mail-oa1-f54.google.com [209.85.160.54]) by mx.groups.io with SMTP id smtpd.web10.11472.1720800198477815924 for ; Fri, 12 Jul 2024 09:03:18 -0700 Authentication-Results: mx.groups.io; dkim=pass header.i=@gmail.com header.s=20230601 header.b=KhCwMLNF; spf=pass (domain: gmail.com, ip: 209.85.160.54, mailfrom: jpewhacker@gmail.com) Received: by mail-oa1-f54.google.com with SMTP id 586e51a60fabf-25e076f79d5so995253fac.2 for ; Fri, 12 Jul 2024 09:03:18 -0700 (PDT) DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=gmail.com; s=20230601; t=1720800197; x=1721404997; darn=lists.openembedded.org; h=content-transfer-encoding:mime-version:references:in-reply-to :message-id:date:subject:cc:to:from:from:to:cc:subject:date :message-id:reply-to; bh=jtgEKVQ7DIhST8KVqQTuZZAMLu1C/Eep2QYIJd2wEjE=; b=KhCwMLNFNhI0m15T5gg6c/sL9niYFPog5eMEev4+wwPTWmP7XzaE0KRtbaHGU7x47y hEc8Y1hLlYIi6qcQN0XUdtRhuKMtQq9AR1Ea+WJtgRWA/lpipIFMh76r24WK45oTDD0r QOEP4B3QG6BaZB66G0cRWUWSFzzbU2Dm8xSWuU2FdM9KXFjJeyMuxacP215PVU3kVxHi wym5WCpwR2/q3fSIsAYE4vCp0+ALKxZd+9mOetFknhihvm4IEHHwtZY6jmWXIRhAHBVg TiZYQ7th2j50tfDQ9hSkPuP/c7F2H+2eN/3mYyTwZa+G+L5kxeDoATExjxwnNzImLRLD uuFw== X-Google-DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=1e100.net; s=20230601; t=1720800197; x=1721404997; h=content-transfer-encoding:mime-version:references:in-reply-to :message-id:date:subject:cc:to:from:x-gm-message-state:from:to:cc :subject:date:message-id:reply-to; bh=jtgEKVQ7DIhST8KVqQTuZZAMLu1C/Eep2QYIJd2wEjE=; b=tNvCTMSh1zAV9oyBmghpfWOoVAMr2DshJSzScWORRuWkxPQbQQvwn0xsYOkxIEnBq5 8Lr56tbQTlb49IIEIMLVeQQaGBSoAQbghGl0HDbIrEFRKLvtBBNE9vWRCFtxPDa9A+5b SsyHaAr1dKyAO3cWnh4UJ6ZiViqUo3ppzeoQlQUS+kkFP796g9sbWPyy8zYjGAJ4rApI Cj+H1aXv2amAVOPxQQtJhuhcU89KPuwSypmrAPfnMd8zOC5nHZXXU1AmVvxdTRl+uP10 8VodfqeUr2Kz84B0lfrTYL8+q2GItrGxO1IoXdEMeGpTk76pl239T4XBMXGIzIvZ0Ma5 h1uA== X-Gm-Message-State: AOJu0YxhkLneBdyj87aMONq3/D7Gpgl9TwE6jsEnu2GW8b9evHiI3kYt XwoQJclNDZKCqI8ChJufBs2jpUE2FeCmmvTNw73GPQTmmeRwoIx1/0uSmg== X-Google-Smtp-Source: 
AGHT+IFsQxLQxBB1u2cdHtQ7SWWq5A9uffCsLsj1KgPOouUITCGxgpkMu4otHRpy9wiSRTtfLviBUQ== X-Received: by 2002:a05:6871:408e:b0:25e:1976:37b6 with SMTP id 586e51a60fabf-25eaebdb53emr11336728fac.39.1720800197059; Fri, 12 Jul 2024 09:03:17 -0700 (PDT) Received: from localhost.localdomain ([2601:282:4300:19e0::4a71]) by smtp.gmail.com with ESMTPSA id 586e51a60fabf-25eaa29d16dsm2267694fac.53.2024.07.12.09.03.15 (version=TLS1_3 cipher=TLS_AES_256_GCM_SHA384 bits=256/256); Fri, 12 Jul 2024 09:03:15 -0700 (PDT) From: Joshua Watt X-Google-Original-From: Joshua Watt To: openembedded-core@lists.openembedded.org Cc: Joshua Watt Subject: [OE-core][PATCH v6 05/12] classes-recipe/image_types: Add SPDX_IMAGE_PURPOSE to images Date: Fri, 12 Jul 2024 09:58:15 -0600 Message-ID: <20240712160304.3514496-6-JPEWhacker@gmail.com> X-Mailer: git-send-email 2.45.2 In-Reply-To: <20240712160304.3514496-1-JPEWhacker@gmail.com> References: <20240703140059.4096394-1-JPEWhacker@gmail.com> <20240712160304.3514496-1-JPEWhacker@gmail.com> MIME-Version: 1.0 List-Id: X-Webhook-Received: from li982-79.members.linode.com [45.33.32.79] by aws-us-west-2-korg-lkml-1.web.codeaurora.org with HTTPS for ; Fri, 12 Jul 2024 16:03:22 -0000 X-Groupsio-URL: https://lists.openembedded.org/g/openembedded-core/message/201839 Adds the variable overrides to set the SPDX image purpose for various image types Signed-off-by: Joshua Watt --- meta/classes-recipe/image_types.bbclass | 2 ++ meta/classes-recipe/image_types_wic.bbclass | 1 + 2 files changed, 3 insertions(+) diff --git a/meta/classes-recipe/image_types.bbclass b/meta/classes-recipe/image_types.bbclass index 2f948ecbf88..506b9934cb7 100644 --- a/meta/classes-recipe/image_types.bbclass +++ b/meta/classes-recipe/image_types.bbclass @@ -146,6 +146,7 @@ IMAGE_CMD:vfat = "oe_mkvfatfs ${EXTRA_IMAGECMD}" IMAGE_CMD_TAR ?= "tar" # ignore return code 1 "file changed as we read it" as other tasks(e.g. do_image_wic) may be hardlinking rootfs IMAGE_CMD:tar = "${IMAGE_CMD_TAR} --sort=name --format=posix --numeric-owner -cf ${IMGDEPLOYDIR}/${IMAGE_NAME}.tar -C ${IMAGE_ROOTFS} . || [ $? 
-eq 1 ]" +SPDX_IMAGE_PURPOSE:tar = "archive" do_image_cpio[cleandirs] += "${WORKDIR}/cpio_append" IMAGE_CMD:cpio () { @@ -167,6 +168,7 @@ IMAGE_CMD:cpio () { fi fi } +SPDX_IMAGE_PURPOSE:cpio = "archive" UBI_VOLNAME ?= "${MACHINE}-rootfs" UBI_VOLTYPE ?= "dynamic" diff --git a/meta/classes-recipe/image_types_wic.bbclass b/meta/classes-recipe/image_types_wic.bbclass index cf3be909b30..86f40633ebc 100644 --- a/meta/classes-recipe/image_types_wic.bbclass +++ b/meta/classes-recipe/image_types_wic.bbclass @@ -91,6 +91,7 @@ IMAGE_CMD:wic () { mv "$build_wic/$(basename "${wks%.wks}")"*.${IMAGER} "$out.wic" } IMAGE_CMD:wic[vardepsexclude] = "WKS_FULL_PATH WKS_FILES TOPDIR" +SPDX_IMAGE_PURPOSE:wic = "diskImage" do_image_wic[cleandirs] = "${WORKDIR}/build-wic" PSEUDO_IGNORE_PATHS .= ",${WORKDIR}/build-wic" From patchwork Fri Jul 12 15:58:16 2024 Content-Type: text/plain; charset="utf-8" MIME-Version: 1.0 Content-Transfer-Encoding: 7bit X-Patchwork-Submitter: Joshua Watt X-Patchwork-Id: 46265 Return-Path: X-Spam-Checker-Version: SpamAssassin 3.4.0 (2014-02-07) on aws-us-west-2-korg-lkml-1.web.codeaurora.org Received: from aws-us-west-2-korg-lkml-1.web.codeaurora.org (localhost.localdomain [127.0.0.1]) by smtp.lore.kernel.org (Postfix) with ESMTP id F040CC41513 for ; Fri, 12 Jul 2024 16:03:22 +0000 (UTC) Received: from mail-oa1-f44.google.com (mail-oa1-f44.google.com [209.85.160.44]) by mx.groups.io with SMTP id smtpd.web10.11475.1720800200216514242 for ; Fri, 12 Jul 2024 09:03:20 -0700 Authentication-Results: mx.groups.io; dkim=pass header.i=@gmail.com header.s=20230601 header.b=WIGaaGD6; spf=pass (domain: gmail.com, ip: 209.85.160.44, mailfrom: jpewhacker@gmail.com) Received: by mail-oa1-f44.google.com with SMTP id 586e51a60fabf-25e0c0df558so982286fac.0 for ; Fri, 12 Jul 2024 09:03:20 -0700 (PDT) DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=gmail.com; s=20230601; t=1720800199; x=1721404999; darn=lists.openembedded.org; h=content-transfer-encoding:mime-version:references:in-reply-to :message-id:date:subject:cc:to:from:from:to:cc:subject:date :message-id:reply-to; bh=eYjnaAH+9QgnRkIiZyxDssup1XuZFHF3jTuz5G3k7Kw=; b=WIGaaGD6+SgBbztfipGjni1YD1TReOANLGcszeFUjcqpj54KHhamPpChqSHWzfip0b AxyJAKbgBgdmkZvBeoIIoydc++g5NJDWyyzlcf/TdG22XzM8zUanpqedF/aaJaLZeCZB 0pd4ZPC7vQ2jq3ctFTD19bRjKk/Czz/ZxGkZTI763rr1uckKfDu1akeA5owq89oW6QH2 MXZnkwNmpNpBYMYbN2znsSztcrPkA3Rc0iPAe1xVL1WgZdtYZi7acFlr+bY6OPnP2cTi 4Rhfh3BT6v3LMuUXIMyhxENptzM9g6mHoywjYY+T5ZzDz1hqyjXJAw/z/bEuzpwWfm/E 6DwQ== X-Google-DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=1e100.net; s=20230601; t=1720800199; x=1721404999; h=content-transfer-encoding:mime-version:references:in-reply-to :message-id:date:subject:cc:to:from:x-gm-message-state:from:to:cc :subject:date:message-id:reply-to; bh=eYjnaAH+9QgnRkIiZyxDssup1XuZFHF3jTuz5G3k7Kw=; b=RTyBA2toGf4sMMjJ9x+mZAqwSIEshvIrlY/IuFhEGmzXRdfeJn1gF/QkcDAbsGseKA 2ur+nYu9fU+U7Td7tpMWjmr9gnIhal9AJuXUaFeW52/pzzgaoS5Ofhlc77gex3S9BYHc O3qBZpZsq1exHvQE9Dya2ZfSIYW4xTd5MQrPnmlryFeh2fsUjA1x7SnsjlQEl35jY5dW c19c46NsTHbeQufmGnmDxLj5HtD3EtSFy6cLhYKRTcv1TAMvkecG2u17bjRPvl9P3RY4 XZ5IoLV87vFxlnf6WAphtDaOdVO5Rh/A9ghF3+xlR6Vz55xdC+/oXMrwaITgev7b+fQG fv/Q== X-Gm-Message-State: AOJu0YyLEu8/BSPnr5pA59Sbbd7cAiK7NYyc3wErtpDqC//HHw8oKp8D eqy7E5mYIl6fzYTxdcWh9VI507jqqxOyffOvxwFuNf5EdXBq7l6dGq/sow== X-Google-Smtp-Source: AGHT+IE3L77oaLBtBmae1iLmuKlaPj8Rzlk0NijMXVbLodMbyFkoPXud0tst+G+NELCC+a9vemOjdg== X-Received: by 2002:a05:6871:5c9:b0:25e:128e:609f with SMTP id 586e51a60fabf-25eae822e35mr10019308fac.28.1720800198658; Fri, 12 Jul 
2024 09:03:18 -0700 (PDT) Received: from localhost.localdomain ([2601:282:4300:19e0::4a71]) by smtp.gmail.com with ESMTPSA id 586e51a60fabf-25eaa29d16dsm2267694fac.53.2024.07.12.09.03.17 (version=TLS1_3 cipher=TLS_AES_256_GCM_SHA384 bits=256/256); Fri, 12 Jul 2024 09:03:17 -0700 (PDT) From: Joshua Watt X-Google-Original-From: Joshua Watt To: openembedded-core@lists.openembedded.org Cc: Joshua Watt Subject: [OE-core][PATCH v6 06/12] selftest: spdx: Add SPDX 3.0 test cases Date: Fri, 12 Jul 2024 09:58:16 -0600 Message-ID: <20240712160304.3514496-7-JPEWhacker@gmail.com> X-Mailer: git-send-email 2.45.2 In-Reply-To: <20240712160304.3514496-1-JPEWhacker@gmail.com> References: <20240703140059.4096394-1-JPEWhacker@gmail.com> <20240712160304.3514496-1-JPEWhacker@gmail.com> MIME-Version: 1.0 List-Id: X-Webhook-Received: from li982-79.members.linode.com [45.33.32.79] by aws-us-west-2-korg-lkml-1.web.codeaurora.org with HTTPS for ; Fri, 12 Jul 2024 16:03:22 -0000 X-Groupsio-URL: https://lists.openembedded.org/g/openembedded-core/message/201840 Adds test cases for SPDX 3.0. Reworks the SPDX 2.2 test setup so it can also be run even if the default is SPDX 3.0 Signed-off-by: Joshua Watt --- meta/lib/oeqa/selftest/cases/spdx.py | 133 +++++++++++++++++++++++++-- 1 file changed, 124 insertions(+), 9 deletions(-) diff --git a/meta/lib/oeqa/selftest/cases/spdx.py b/meta/lib/oeqa/selftest/cases/spdx.py index 7685a81e7fb..be595babb35 100644 --- a/meta/lib/oeqa/selftest/cases/spdx.py +++ b/meta/lib/oeqa/selftest/cases/spdx.py @@ -6,21 +6,26 @@ import json import os +import textwrap +from pathlib import Path from oeqa.selftest.case import OESelftestTestCase -from oeqa.utils.commands import bitbake, get_bb_var, runCmd +from oeqa.utils.commands import bitbake, get_bb_var, get_bb_vars, runCmd -class SPDXCheck(OESelftestTestCase): +class SPDX22Check(OESelftestTestCase): @classmethod def setUpClass(cls): - super(SPDXCheck, cls).setUpClass() + super().setUpClass() bitbake("python3-spdx-tools-native") bitbake("-c addto_recipe_sysroot python3-spdx-tools-native") def check_recipe_spdx(self, high_level_dir, spdx_file, target_name): - config = """ -INHERIT += "create-spdx" -""" + config = textwrap.dedent( + """\ + INHERIT:remove = "create-spdx" + INHERIT += "create-spdx-2.2" + """ + ) self.write_config(config) deploy_dir = get_bb_var("DEPLOY_DIR") @@ -29,7 +34,9 @@ INHERIT += "create-spdx" # qemux86-64 creates the directory qemux86_64 machine_dir = machine_var.replace("-", "_") - full_file_path = os.path.join(deploy_dir, "spdx", spdx_version, machine_dir, high_level_dir, spdx_file) + full_file_path = os.path.join( + deploy_dir, "spdx", spdx_version, machine_dir, high_level_dir, spdx_file + ) try: os.remove(full_file_path) @@ -44,8 +51,13 @@ INHERIT += "create-spdx" self.assertNotEqual(report, None) self.assertNotEqual(report["SPDXID"], None) - python = os.path.join(get_bb_var('STAGING_BINDIR', 'python3-spdx-tools-native'), 'nativepython3') - validator = os.path.join(get_bb_var('STAGING_BINDIR', 'python3-spdx-tools-native'), 'pyspdxtools') + python = os.path.join( + get_bb_var("STAGING_BINDIR", "python3-spdx-tools-native"), + "nativepython3", + ) + validator = os.path.join( + get_bb_var("STAGING_BINDIR", "python3-spdx-tools-native"), "pyspdxtools" + ) result = runCmd("{} {} -i {}".format(python, validator, filename)) self.assertExists(full_file_path) @@ -53,3 +65,106 @@ INHERIT += "create-spdx" def test_spdx_base_files(self): self.check_recipe_spdx("packages", "base-files.spdx.json", "base-files") + + +class 
SPDX3CheckBase(object): + """ + Base class for checking SPDX 3 based tests + """ + + def check_spdx_file(self, filename): + import oe.spdx30 + + self.assertExists(filename) + + # Read the file + objset = oe.spdx30.SHACLObjectSet() + with open(filename, "r") as f: + d = oe.spdx30.JSONLDDeserializer() + d.read(f, objset) + + return objset + + def check_recipe_spdx(self, target_name, spdx_path, *, task=None, extraconf=""): + config = textwrap.dedent( + f"""\ + INHERIT:remove = "create-spdx" + INHERIT += "{self.SPDX_CLASS}" + {extraconf} + """ + ) + self.write_config(config) + + if task: + bitbake(f"-c {task} {target_name}") + else: + bitbake(target_name) + + filename = spdx_path.format( + **get_bb_vars( + [ + "DEPLOY_DIR_IMAGE", + "DEPLOY_DIR_SPDX", + "MACHINE", + "MACHINE_ARCH", + "SDKMACHINE", + "SDK_DEPLOY", + "SPDX_VERSION", + "TOOLCHAIN_OUTPUTNAME", + ], + target_name, + ) + ) + + return self.check_spdx_file(filename) + + def check_objset_missing_ids(self, objset): + if objset.missing_ids: + self.assertTrue( + False, + "The following SPDXIDs are unresolved:\n " + + "\n ".join(objset.missing_ids), + ) + + +class SPDX30Check(SPDX3CheckBase, OESelftestTestCase): + SPDX_CLASS = "create-spdx-3.0" + + def test_base_files(self): + self.check_recipe_spdx( + "base-files", + "{DEPLOY_DIR_SPDX}/{MACHINE_ARCH}/packages/base-files.spdx.json", + ) + + def test_core_image_minimal(self): + objset = self.check_recipe_spdx( + "core-image-minimal", + "{DEPLOY_DIR_IMAGE}/core-image-minimal-{MACHINE}.rootfs.spdx.json", + ) + + # Document should be fully linked + self.check_objset_missing_ids(objset) + + def test_core_image_minimal_sdk(self): + objset = self.check_recipe_spdx( + "core-image-minimal", + "{SDK_DEPLOY}/{TOOLCHAIN_OUTPUTNAME}.spdx.json", + task="populate_sdk", + ) + + # Document should be fully linked + self.check_objset_missing_ids(objset) + + def test_baremetal_helloworld(self): + objset = self.check_recipe_spdx( + "baremetal-helloworld", + "{DEPLOY_DIR_IMAGE}/baremetal-helloworld-image-{MACHINE}.spdx.json", + extraconf=textwrap.dedent( + """\ + TCLIBC = "baremetal" + """ + ), + ) + + # Document should be fully linked + self.check_objset_missing_ids(objset) From patchwork Fri Jul 12 15:58:17 2024 Content-Type: text/plain; charset="utf-8" MIME-Version: 1.0 Content-Transfer-Encoding: 7bit X-Patchwork-Submitter: Joshua Watt X-Patchwork-Id: 46266 Return-Path: X-Spam-Checker-Version: SpamAssassin 3.4.0 (2014-02-07) on aws-us-west-2-korg-lkml-1.web.codeaurora.org Received: from aws-us-west-2-korg-lkml-1.web.codeaurora.org (localhost.localdomain [127.0.0.1]) by smtp.lore.kernel.org (Postfix) with ESMTP id 0774BC3DA45 for ; Fri, 12 Jul 2024 16:03:23 +0000 (UTC) Received: from mail-oo1-f45.google.com (mail-oo1-f45.google.com [209.85.161.45]) by mx.groups.io with SMTP id smtpd.web10.11476.1720800201801790492 for ; Fri, 12 Jul 2024 09:03:21 -0700 Authentication-Results: mx.groups.io; dkim=pass header.i=@gmail.com header.s=20230601 header.b=OAYd3Cob; spf=pass (domain: gmail.com, ip: 209.85.161.45, mailfrom: jpewhacker@gmail.com) Received: by mail-oo1-f45.google.com with SMTP id 006d021491bc7-5c690949977so1112369eaf.1 for ; Fri, 12 Jul 2024 09:03:21 -0700 (PDT) DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=gmail.com; s=20230601; t=1720800200; x=1721405000; darn=lists.openembedded.org; h=content-transfer-encoding:mime-version:references:in-reply-to :message-id:date:subject:cc:to:from:from:to:cc:subject:date :message-id:reply-to; bh=eH/eCSiti6j3SN/dayvJvxY3TEZG94Qfnju8TXLySzw=; 
b=OAYd3CobrF1dgzbk6yswic4Coi8YbWB81fAJ6GzXKbl6ZzXC4a1WYC+ZiintOh2DuQ dvxvjT5FbAKb49lo2DdIQDe0W9eSyxIfgxVTKcSCsdBB27JhE/WgSkU9AD48zmI0kt2K sb5WzVFFmMbImjohlVGqPxiNcv0XNysigKPIvsHkUhQTv7ZjjH3dlqyORJVvR2V5IPjv ZKejHFP1nMa5RX87h52QdAIN511ONccHLbFYWFwGusESR2uAAOmGJdPPBkyxl8ACxnv3 K6sA1hIjtGfErV57btOLC6o0Z3IW/SuQHK89/VckBzFUPWYx42mW0Vg06mSyiJiBp6zr mAsQ== X-Google-DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=1e100.net; s=20230601; t=1720800200; x=1721405000; h=content-transfer-encoding:mime-version:references:in-reply-to :message-id:date:subject:cc:to:from:x-gm-message-state:from:to:cc :subject:date:message-id:reply-to; bh=eH/eCSiti6j3SN/dayvJvxY3TEZG94Qfnju8TXLySzw=; b=CIl/OCTsJFtPho5d9/q66ldekMi3GqDd9uFE1EY2/mRv3vZP00pGDF475tQHaf+V9Q Td4gKpCnuYrE/DALRB40Y+AqOZFaajDFR1cP7TgHXUoc4/PPJDrHhV9rDs9Pg3Gc6yab UvJrNgpT6NIzjAuehXRf+SqSC8cQtWt67ZVhUWQAD08FccK7BdRa92NMq402v3YQow5+ JTw3AieO3JpyqvgDORK6i3Ti6iVQK0DKv3tNKa9SIX45dOdB4+n1Wew8lS2r3DyIqXao 1GG7NIg/8HR8AWubIeGvFNSAXQ7DvqYFFzTps2fsf88Zw1tM/mFUkDVB2zlpIk9wcZY1 GVOg== X-Gm-Message-State: AOJu0YxvtB+Yla5ypQJUNCKGaIipnfFbrsJ+PakZtV20vYMCMDL5JOvW 5e3RdW6aduxTSHHRxN6S1R1HV8xo0UT4bqySEiJxYfGwhhugDZ7TKck/+A== X-Google-Smtp-Source: AGHT+IH04zKum3I7gf6HpO02xvZoEMUJg5Zx/uRwcnU1R4SlmYXmuSyRlsw0l0W2ik46NB7/r6tO7Q== X-Received: by 2002:a05:6870:c1cb:b0:25d:f654:9cd6 with SMTP id 586e51a60fabf-25eaec168d6mr9816084fac.38.1720800200491; Fri, 12 Jul 2024 09:03:20 -0700 (PDT) Received: from localhost.localdomain ([2601:282:4300:19e0::4a71]) by smtp.gmail.com with ESMTPSA id 586e51a60fabf-25eaa29d16dsm2267694fac.53.2024.07.12.09.03.18 (version=TLS1_3 cipher=TLS_AES_256_GCM_SHA384 bits=256/256); Fri, 12 Jul 2024 09:03:19 -0700 (PDT) From: Joshua Watt X-Google-Original-From: Joshua Watt To: openembedded-core@lists.openembedded.org Cc: Joshua Watt Subject: [OE-core][PATCH v6 07/12] classes-recipe: nospdx: Add class Date: Fri, 12 Jul 2024 09:58:17 -0600 Message-ID: <20240712160304.3514496-8-JPEWhacker@gmail.com> X-Mailer: git-send-email 2.45.2 In-Reply-To: <20240712160304.3514496-1-JPEWhacker@gmail.com> References: <20240703140059.4096394-1-JPEWhacker@gmail.com> <20240712160304.3514496-1-JPEWhacker@gmail.com> MIME-Version: 1.0 List-Id: X-Webhook-Received: from li982-79.members.linode.com [45.33.32.79] by aws-us-west-2-korg-lkml-1.web.codeaurora.org with HTTPS for ; Fri, 12 Jul 2024 16:03:23 -0000 X-Groupsio-URL: https://lists.openembedded.org/g/openembedded-core/message/201841 Adds a class that allows recipes to opt out of generating SPDX Signed-off-by: Joshua Watt --- meta/classes-recipe/nospdx.bbclass | 13 +++++++++++++ meta/recipes-core/meta/build-sysroots.bb | 5 +---- meta/recipes-core/meta/meta-world-pkgdata.bb | 3 +-- 3 files changed, 15 insertions(+), 6 deletions(-) create mode 100644 meta/classes-recipe/nospdx.bbclass diff --git a/meta/classes-recipe/nospdx.bbclass b/meta/classes-recipe/nospdx.bbclass new file mode 100644 index 00000000000..b20e28218be --- /dev/null +++ b/meta/classes-recipe/nospdx.bbclass @@ -0,0 +1,13 @@ +# +# Copyright OpenEmbedded Contributors +# +# SPDX-License-Identifier: MIT +# + +deltask do_collect_spdx_deps +deltask do_create_spdx +deltask do_create_spdx_runtime +deltask do_create_package_spdx +deltask do_create_rootfs_spdx +deltask do_create_image_spdx +deltask do_create_image_sbom diff --git a/meta/recipes-core/meta/build-sysroots.bb b/meta/recipes-core/meta/build-sysroots.bb index db05c111ab2..b0b8fb3c79a 100644 --- a/meta/recipes-core/meta/build-sysroots.bb +++ b/meta/recipes-core/meta/build-sysroots.bb @@ -7,7 +7,7 @@ 
STANDALONE_SYSROOT_NATIVE = "${STAGING_DIR}/${BUILD_ARCH}" PACKAGE_ARCH = "${MACHINE_ARCH}" EXCLUDE_FROM_WORLD = "1" -inherit nopackages +inherit nopackages nospdx deltask fetch deltask unpack deltask patch @@ -17,9 +17,6 @@ deltask configure deltask compile deltask install deltask populate_sysroot -deltask create_spdx -deltask collect_spdx_deps -deltask create_runtime_spdx deltask recipe_qa do_build_warn () { diff --git a/meta/recipes-core/meta/meta-world-pkgdata.bb b/meta/recipes-core/meta/meta-world-pkgdata.bb index 0438bf61387..244175ddd44 100644 --- a/meta/recipes-core/meta/meta-world-pkgdata.bb +++ b/meta/recipes-core/meta/meta-world-pkgdata.bb @@ -27,14 +27,13 @@ python do_collect_packagedata() { oe.copy_buildsystem.generate_locked_sigs(sigfile, d) } +inherit nospdx deltask do_fetch deltask do_unpack deltask do_patch deltask do_configure deltask do_compile deltask do_install -deltask do_create_spdx -deltask do_create_spdx_runtime do_prepare_recipe_sysroot[deptask] = "" From patchwork Fri Jul 12 15:58:18 2024 Content-Type: text/plain; charset="utf-8" MIME-Version: 1.0 Content-Transfer-Encoding: 7bit X-Patchwork-Submitter: Joshua Watt X-Patchwork-Id: 46268 Return-Path: X-Spam-Checker-Version: SpamAssassin 3.4.0 (2014-02-07) on aws-us-west-2-korg-lkml-1.web.codeaurora.org Received: from aws-us-west-2-korg-lkml-1.web.codeaurora.org (localhost.localdomain [127.0.0.1]) by smtp.lore.kernel.org (Postfix) with ESMTP id 0BC67C2BD09 for ; Fri, 12 Jul 2024 16:03:33 +0000 (UTC) Received: from mail-oa1-f52.google.com (mail-oa1-f52.google.com [209.85.160.52]) by mx.groups.io with SMTP id smtpd.web10.11478.1720800203551896646 for ; Fri, 12 Jul 2024 09:03:23 -0700 Authentication-Results: mx.groups.io; dkim=pass header.i=@gmail.com header.s=20230601 header.b=KqlYND+6; spf=pass (domain: gmail.com, ip: 209.85.160.52, mailfrom: jpewhacker@gmail.com) Received: by mail-oa1-f52.google.com with SMTP id 586e51a60fabf-25e400d78b0so848980fac.2 for ; Fri, 12 Jul 2024 09:03:23 -0700 (PDT) DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=gmail.com; s=20230601; t=1720800202; x=1721405002; darn=lists.openembedded.org; h=content-transfer-encoding:mime-version:references:in-reply-to :message-id:date:subject:cc:to:from:from:to:cc:subject:date :message-id:reply-to; bh=+VYdiuKrDKrBqkpOiI+JNPgOWytHmfR9UpW3NdsbrnU=; b=KqlYND+6WjJ3gtdRgC5CbRWie+7g+Iai0n7T4uQEwiu2VloxdHHw6U2uuEERGAyMIo y6q6xrL/v8IdoCuasoOALZsiOyt/xh8LLCqm8Iu7Ng6XkToX1JXQzl0/rf/aBixOXQm+ fG5EfLGt7wWcSwDL/qvdZbWsov6P111NL+SME/rnvB4BAYj7XXC2TSYqh8p1M54XWcZg MPDpW5JauKhrbCx2iCe7J9HlmNj7fXYs2dCOYJBMTC96vluU+h50pJOOoYMxLiQxu2ge 4CYnv9q+TvKnqq3wP/xjIMAKBErRN5KRe8KiZjO2+Vzxueevp/6QDb2Dhe9RsM/1CSD1 XZbA== X-Google-DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=1e100.net; s=20230601; t=1720800202; x=1721405002; h=content-transfer-encoding:mime-version:references:in-reply-to :message-id:date:subject:cc:to:from:x-gm-message-state:from:to:cc :subject:date:message-id:reply-to; bh=+VYdiuKrDKrBqkpOiI+JNPgOWytHmfR9UpW3NdsbrnU=; b=fMXu/lJtAOTQK3dW7Tj3Beb8jVTQt/zVlCobS/ZV0HEvgSFP/2Cl+IOt/hNCVciYrk XALkc9+MEd8BDF/OV+jciducrOrY399BXBG38+k1yIQxI2Ar1pFzDf5Ix/hzlXKPXlHF LkysmEDRWfBiUDmRDGVn1FZTrW8OSgR8sUDp41L54xAq9/j8lI5WkHgW20VXTODoitdy mH1kweMyWLyOPnPBCPbWf99lMkPW7SQIUFx2vB0ZACHJRoc4fJAtSbVcErunbJDAig6C YeRb6ZxkJJ2ngG03sDL/ywUiG3LVW/Qbyfd9FjUWHZXA/nDm8tHoWJKMY/lyjY8RKNv8 2XZw== X-Gm-Message-State: AOJu0YxSOMdqqlEyba7jUmho30eoHkF1zQ18UX55yiju+Cf+4BxsGeKK RwpyZcrQdIGFHrn5nMV/5qjBl7ZfWdl1csLrVjO9oxHhHstN5P9P5P+mwQ== X-Google-Smtp-Source: 
AGHT+IG69cuxBmpwyQrevepLflPjHvxBJ88TMGoOLVFSFnUgBeoD7Ix2M6xX1H+nozKImK7FvNmqBw== X-Received: by 2002:a05:6870:55ce:b0:250:756b:b1ed with SMTP id 586e51a60fabf-25eae806599mr10068761fac.19.1720800202062; Fri, 12 Jul 2024 09:03:22 -0700 (PDT) Received: from localhost.localdomain ([2601:282:4300:19e0::4a71]) by smtp.gmail.com with ESMTPSA id 586e51a60fabf-25eaa29d16dsm2267694fac.53.2024.07.12.09.03.20 (version=TLS1_3 cipher=TLS_AES_256_GCM_SHA384 bits=256/256); Fri, 12 Jul 2024 09:03:21 -0700 (PDT) From: Joshua Watt X-Google-Original-From: Joshua Watt To: openembedded-core@lists.openembedded.org Cc: Joshua Watt Subject: [OE-core][PATCH v6 08/12] selftest: sstatetests: Exclude all SPDX tasks Date: Fri, 12 Jul 2024 09:58:18 -0600 Message-ID: <20240712160304.3514496-9-JPEWhacker@gmail.com> X-Mailer: git-send-email 2.45.2 In-Reply-To: <20240712160304.3514496-1-JPEWhacker@gmail.com> References: <20240703140059.4096394-1-JPEWhacker@gmail.com> <20240712160304.3514496-1-JPEWhacker@gmail.com> MIME-Version: 1.0 List-Id: X-Webhook-Received: from li982-79.members.linode.com [45.33.32.79] by aws-us-west-2-korg-lkml-1.web.codeaurora.org with HTTPS for ; Fri, 12 Jul 2024 16:03:33 -0000 X-Groupsio-URL: https://lists.openembedded.org/g/openembedded-core/message/201842 SPDX 3.0 introduces a bunch of new SPDX tasks. Instead of explicitly enumerating them all, modify the regex to match 'create_.*spdx' which will cover all of the SPDX 2.0 and SPDX 3.0 tasks Signed-off-by: Joshua Watt --- meta/lib/oeqa/selftest/cases/sstatetests.py | 3 +-- 1 file changed, 1 insertion(+), 2 deletions(-) diff --git a/meta/lib/oeqa/selftest/cases/sstatetests.py b/meta/lib/oeqa/selftest/cases/sstatetests.py index 94ad6e38b68..0153ef37cb6 100644 --- a/meta/lib/oeqa/selftest/cases/sstatetests.py +++ b/meta/lib/oeqa/selftest/cases/sstatetests.py @@ -933,8 +933,7 @@ class SStateCheckObjectPresence(SStateBase): # these get influnced by IMAGE_FSTYPES tweaks in yocto-autobuilder-helper's config.json (on x86-64) # additionally, they depend on noexec (thus, absent stamps) package, install, etc. 
image tasks, # which makes tracing other changes difficult - exceptions += ["{}.*create_spdx".format(t) for t in targets.split()] - exceptions += ["{}.*create_runtime_spdx".format(t) for t in targets.split()] + exceptions += ["{}.*create_.*spdx".format(t) for t in targets.split()] output_l = output.splitlines() for l in output_l: From patchwork Fri Jul 12 15:58:19 2024 Content-Type: text/plain; charset="utf-8" MIME-Version: 1.0 Content-Transfer-Encoding: 7bit X-Patchwork-Submitter: Joshua Watt X-Patchwork-Id: 46271 Return-Path: X-Spam-Checker-Version: SpamAssassin 3.4.0 (2014-02-07) on aws-us-west-2-korg-lkml-1.web.codeaurora.org Received: from aws-us-west-2-korg-lkml-1.web.codeaurora.org (localhost.localdomain [127.0.0.1]) by smtp.lore.kernel.org (Postfix) with ESMTP id 608B3C3DA52 for ; Fri, 12 Jul 2024 16:03:33 +0000 (UTC) Received: from mail-oa1-f44.google.com (mail-oa1-f44.google.com [209.85.160.44]) by mx.groups.io with SMTP id smtpd.web10.11480.1720800205762156326 for ; Fri, 12 Jul 2024 09:03:25 -0700 Authentication-Results: mx.groups.io; dkim=pass header.i=@gmail.com header.s=20230601 header.b=Q3K/l2Jl; spf=pass (domain: gmail.com, ip: 209.85.160.44, mailfrom: jpewhacker@gmail.com) Received: by mail-oa1-f44.google.com with SMTP id 586e51a60fabf-260209df55dso1030097fac.2 for ; Fri, 12 Jul 2024 09:03:25 -0700 (PDT) DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=gmail.com; s=20230601; t=1720800204; x=1721405004; darn=lists.openembedded.org; h=content-transfer-encoding:mime-version:references:in-reply-to :message-id:date:subject:cc:to:from:from:to:cc:subject:date :message-id:reply-to; bh=QWoDgZ1K0QP0HN08F4ewTMnYzS1aixFyRNVKKv3KFq8=; b=Q3K/l2Jl3l488Qze2UZKwkSVZ6nDfKJeTKYkCuGhOAm0ND/v7zZUGJhj71EhnqzrJJ 0oMEsC8QVbGAHbkyZfGF3/eOqxbsxl2IJ3tWBvYYxEu/TDycQtEAVTvqp0pnLq7W3Nc8 FkbL1z3CysA1d1LMPkY8RexNfP/5g0d9ZoNtpqkrMHtBS13iNJmO0YFq5EsvUwzv02BC Bops+8Bm9BsjMYek37gW9nIaf/gkLAyU7/98koyEDbIFsDBYMxm1r7mo4vSwNn2awn1v 8VtI+6BHrFGL2XecbUOEjvcZ4sK4OW8bFZD7H2i0yOPTUbyKapOm/u3a56llUqOa9SIZ YjBQ== X-Google-DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=1e100.net; s=20230601; t=1720800204; x=1721405004; h=content-transfer-encoding:mime-version:references:in-reply-to :message-id:date:subject:cc:to:from:x-gm-message-state:from:to:cc :subject:date:message-id:reply-to; bh=QWoDgZ1K0QP0HN08F4ewTMnYzS1aixFyRNVKKv3KFq8=; b=HUclYHqEvVwe+50b8OXFCqXI06IMMYK5jvwK05ebbOf2PMGlR1Ey3NgAewIP/38JLa 036R0FxI1Mbpj7hhm/5PC+FGZ67B3StVEQHEiI4F7qL10WZtd/guM8aRTdhXtN7MGT/4 kLMlNSfbCX7sp2X09An7n0ydXreP6zAjFn7+YokQ5aC3N8MKIYQc3BQS6FVo0sTo0QqS Lxkd3SsPb6skq03rnaU88rUUI0E7PsvqlTPn+V4cBDw7nYbmWkR4gari+Dj5RvQqgXp0 QmfSMUfujBinc6uzvTzlwXyGygHMF1SuJFrYaTUE9az98rfflBUg84UIAHuBt+gh9MT2 vA9w== X-Gm-Message-State: AOJu0Yyx9p6UVc25cI3XnqBFnSfL3cKqB3pA9ASjzJWHkykn854Aeutx EZBwt+aNrVEK/0C74R2lBBDNB6n2t9nSwS9e7XXF27zHIpdjo2lY5UidmA== X-Google-Smtp-Source: AGHT+IFQn63r9s23JT88obt4vARsaC8TtB9h++NQeDa/n1v/ayTASeRG9GYFtVQ8viNBNUk2fIsBKQ== X-Received: by 2002:a05:6870:7083:b0:258:3455:4b37 with SMTP id 586e51a60fabf-25eaeca2647mr9915166fac.59.1720800203664; Fri, 12 Jul 2024 09:03:23 -0700 (PDT) Received: from localhost.localdomain ([2601:282:4300:19e0::4a71]) by smtp.gmail.com with ESMTPSA id 586e51a60fabf-25eaa29d16dsm2267694fac.53.2024.07.12.09.03.22 (version=TLS1_3 cipher=TLS_AES_256_GCM_SHA384 bits=256/256); Fri, 12 Jul 2024 09:03:22 -0700 (PDT) From: Joshua Watt X-Google-Original-From: Joshua Watt To: openembedded-core@lists.openembedded.org Cc: Joshua Watt Subject: [OE-core][PATCH v6 09/12] classes/spdx-common: Move to library 
Date: Fri, 12 Jul 2024 09:58:19 -0600 Message-ID: <20240712160304.3514496-10-JPEWhacker@gmail.com> X-Mailer: git-send-email 2.45.2 In-Reply-To: <20240712160304.3514496-1-JPEWhacker@gmail.com> References: <20240703140059.4096394-1-JPEWhacker@gmail.com> <20240712160304.3514496-1-JPEWhacker@gmail.com> MIME-Version: 1.0 List-Id: X-Webhook-Received: from li982-79.members.linode.com [45.33.32.79] by aws-us-west-2-korg-lkml-1.web.codeaurora.org with HTTPS for ; Fri, 12 Jul 2024 16:03:33 -0000 X-Groupsio-URL: https://lists.openembedded.org/g/openembedded-core/message/201843 Moves the bulk of the code in the spdx-common bbclass into library code Signed-off-by: Joshua Watt --- meta/classes/create-spdx-2.2.bbclass | 23 ++- meta/classes/create-spdx-3.0.bbclass | 22 +- meta/classes/create-spdx-image-3.0.bbclass | 3 +- meta/classes/spdx-common.bbclass | 197 +----------------- meta/lib/oe/sbom30.py | 21 +- meta/lib/oe/spdx_common.py | 228 +++++++++++++++++++++ 6 files changed, 270 insertions(+), 224 deletions(-) create mode 100644 meta/lib/oe/spdx_common.py diff --git a/meta/classes/create-spdx-2.2.bbclass b/meta/classes/create-spdx-2.2.bbclass index 3bcde1acc84..0382e4cc51a 100644 --- a/meta/classes/create-spdx-2.2.bbclass +++ b/meta/classes/create-spdx-2.2.bbclass @@ -38,6 +38,12 @@ def recipe_spdx_is_native(d, recipe): a.annotator == "Tool: %s - %s" % (d.getVar("SPDX_TOOL_NAME"), d.getVar("SPDX_TOOL_VERSION")) and a.comment == "isNative" for a in recipe.annotations) +def get_json_indent(d): + if d.getVar("SPDX_PRETTY") == "1": + return 2 + return None + + def convert_license_to_spdx(lic, document, d, existing={}): from pathlib import Path import oe.spdx @@ -113,6 +119,7 @@ def convert_license_to_spdx(lic, document, d, existing={}): def add_package_files(d, doc, spdx_pkg, topdir, get_spdxid, get_types, *, archive=None, ignore_dirs=[], ignore_top_level_dirs=[]): from pathlib import Path import oe.spdx + import oe.spdx_common import hashlib source_date_epoch = d.getVar("SOURCE_DATE_EPOCH") @@ -165,7 +172,7 @@ def add_package_files(d, doc, spdx_pkg, topdir, get_spdxid, get_types, *, archiv )) if "SOURCE" in spdx_file.fileTypes: - extracted_lics = extract_licenses(filepath) + extracted_lics = oe.spdx_common.extract_licenses(filepath) if extracted_lics: spdx_file.licenseInfoInFiles = extracted_lics @@ -256,6 +263,7 @@ def collect_dep_recipes(d, doc, spdx_recipe): from pathlib import Path import oe.sbom import oe.spdx + import oe.spdx_common deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX")) package_archs = d.getVar("SSTATE_ARCHS").split() @@ -263,7 +271,7 @@ def collect_dep_recipes(d, doc, spdx_recipe): dep_recipes = [] - deps = get_spdx_deps(d) + deps = oe.spdx_common.get_spdx_deps(d) for dep_pn, dep_hashfn, in_taskhash in deps: # If this dependency is not calculated in the taskhash skip it. 
@@ -386,6 +394,7 @@ python do_create_spdx() { from datetime import datetime, timezone import oe.sbom import oe.spdx + import oe.spdx_common import uuid from pathlib import Path from contextlib import contextmanager @@ -478,10 +487,10 @@ python do_create_spdx() { add_download_packages(d, doc, recipe) - if process_sources(d) and include_sources: + if oe.spdx_common.process_sources(d) and include_sources: recipe_archive = deploy_dir_spdx / "recipes" / (doc.name + ".tar.zst") with optional_tarfile(recipe_archive, archive_sources) as archive: - spdx_get_src(d) + oe.spdx_common.get_patched_src(d) add_package_files( d, @@ -588,6 +597,7 @@ python do_create_runtime_spdx() { from datetime import datetime, timezone import oe.sbom import oe.spdx + import oe.spdx_common import oe.packagedata from pathlib import Path @@ -597,7 +607,7 @@ python do_create_runtime_spdx() { creation_time = datetime.now(tz=timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ") - providers = collect_package_providers(d) + providers = oe.spdx_common.collect_package_providers(d) pkg_arch = d.getVar("SSTATE_PKGARCH") package_archs = d.getVar("SSTATE_ARCHS").split() package_archs.reverse() @@ -778,6 +788,7 @@ def combine_spdx(d, rootfs_name, rootfs_deploydir, rootfs_spdxid, packages, spdx import os import oe.spdx import oe.sbom + import oe.spdx_common import io import json from datetime import timezone, datetime @@ -785,7 +796,7 @@ def combine_spdx(d, rootfs_name, rootfs_deploydir, rootfs_spdxid, packages, spdx import tarfile import bb.compress.zstd - providers = collect_package_providers(d) + providers = oe.spdx_common.collect_package_providers(d) package_archs = d.getVar("SSTATE_ARCHS").split() package_archs.reverse() diff --git a/meta/classes/create-spdx-3.0.bbclass b/meta/classes/create-spdx-3.0.bbclass index 51168e4876c..a930ea81152 100644 --- a/meta/classes/create-spdx-3.0.bbclass +++ b/meta/classes/create-spdx-3.0.bbclass @@ -350,20 +350,21 @@ def collect_dep_objsets(d, build): from pathlib import Path import oe.sbom30 import oe.spdx30 + import oe.spdx_common - deps = get_spdx_deps(d) + deps = oe.spdx_common.get_spdx_deps(d) dep_objsets = [] dep_builds = set() dep_build_spdxids = set() - for dep_pn, _, in_taskhash in deps: - bb.debug(1, "Fetching SPDX for dependency %s" % (dep_pn)) - dep_build, dep_objset = oe.sbom30.find_root_obj_in_jsonld(d, "recipes", dep_pn, oe.spdx30.build_Build) + for dep in deps: + bb.debug(1, "Fetching SPDX for dependency %s" % (dep.pn)) + dep_build, dep_objset = oe.sbom30.find_root_obj_in_jsonld(d, "recipes", dep.pn, oe.spdx30.build_Build) # If the dependency is part of the taskhash, return it to be linked # against. 
Otherwise, it cannot be linked against because this recipe # will not rebuilt if dependency changes - if in_taskhash: + if dep.in_taskhash: dep_objsets.append(dep_objset) # The build _can_ be linked against (by alias) @@ -519,6 +520,7 @@ def set_purposes(d, element, *var_names, force_purposes=[]): python do_create_spdx() { import oe.sbom30 import oe.spdx30 + import oe.spdx_common from pathlib import Path from contextlib import contextmanager import oe.cve_check @@ -593,9 +595,9 @@ python do_create_spdx() { [recipe_spdx_license], ) - if process_sources(d) and include_sources: + if oe.spdx_common.process_sources(d) and include_sources: bb.debug(1, "Adding source files to SPDX") - spdx_get_src(d) + oe.spdx_common.get_patched_src(d) build_inputs |= add_package_files( d, @@ -844,6 +846,7 @@ do_create_spdx[depends] += "${PATCHDEPENDENCY}" python do_create_package_spdx() { import oe.sbom30 import oe.spdx30 + import oe.spdx_common import oe.packagedata from pathlib import Path @@ -851,7 +854,7 @@ python do_create_package_spdx() { deploydir = Path(d.getVar("SPDXRUNTIMEDEPLOY")) is_native = bb.data.inherits_class("native", d) or bb.data.inherits_class("cross", d) - providers = collect_package_providers(d) + providers = oe.spdx_common.collect_package_providers(d) pkg_arch = d.getVar("SSTATE_PKGARCH") if not is_native: @@ -957,6 +960,7 @@ do_create_package_spdx[rdeptask] = "do_create_spdx" python spdx30_build_started_handler () { import oe.spdx30 import oe.sbom30 + import oe.spdx_common import os from pathlib import Path from datetime import datetime, timezone @@ -966,7 +970,7 @@ python spdx30_build_started_handler () { d = e.data.createCopy() d.setVar("PN", "bitbake") d.setVar("BB_TASKHASH", "bitbake") - load_spdx_license_data(d) + oe.spdx_common.load_spdx_license_data(d) deploy_dir_spdx = Path(e.data.getVar("DEPLOY_DIR_SPDX")) diff --git a/meta/classes/create-spdx-image-3.0.bbclass b/meta/classes/create-spdx-image-3.0.bbclass index bda11d54d40..467719555d6 100644 --- a/meta/classes/create-spdx-image-3.0.bbclass +++ b/meta/classes/create-spdx-image-3.0.bbclass @@ -10,7 +10,8 @@ SPDXIMAGEDEPLOYDIR = "${SPDXDIR}/image-deploy" SPDXROOTFSDEPLOY = "${SPDXDIR}/rootfs-deploy" def collect_build_package_inputs(d, objset, build, packages): - providers = collect_package_providers(d) + import oe.spdx_common + providers = oe.spdx_common.collect_package_providers(d) build_deps = set() diff --git a/meta/classes/spdx-common.bbclass b/meta/classes/spdx-common.bbclass index 6dfc1fd9e4c..d3110a9bdb0 100644 --- a/meta/classes/spdx-common.bbclass +++ b/meta/classes/spdx-common.bbclass @@ -37,96 +37,11 @@ SPDX_LICENSES ??= "${COREBASE}/meta/files/spdx-licenses.json" SPDX_CUSTOM_ANNOTATION_VARS ??= "" -def extract_licenses(filename): - import re - - lic_regex = re.compile(rb'^\W*SPDX-License-Identifier:\s*([ \w\d.()+-]+?)(?:\s+\W*)?$', re.MULTILINE) - - try: - with open(filename, 'rb') as f: - size = min(15000, os.stat(filename).st_size) - txt = f.read(size) - licenses = re.findall(lic_regex, txt) - if licenses: - ascii_licenses = [lic.decode('ascii') for lic in licenses] - return ascii_licenses - except Exception as e: - bb.warn(f"Exception reading {filename}: {e}") - return [] - -def is_work_shared_spdx(d): - return bb.data.inherits_class('kernel', d) or ('work-shared' in d.getVar('WORKDIR')) - -def get_json_indent(d): - if d.getVar("SPDX_PRETTY") == "1": - return 2 - return None - -def load_spdx_license_data(d): - import json - if d.getVar("SPDX_LICENSE_DATA"): - return - - with open(d.getVar("SPDX_LICENSES"), "r") as 
f: - data = json.load(f) - # Transform the license array to a dictionary - data["licenses"] = {l["licenseId"]: l for l in data["licenses"]} - d.setVar("SPDX_LICENSE_DATA", data) - python() { - load_spdx_license_data(d) + import oe.spdx_common + oe.spdx_common.load_spdx_license_data(d) } -def process_sources(d): - pn = d.getVar('PN') - assume_provided = (d.getVar("ASSUME_PROVIDED") or "").split() - if pn in assume_provided: - for p in d.getVar("PROVIDES").split(): - if p != pn: - pn = p - break - - # glibc-locale: do_fetch, do_unpack and do_patch tasks have been deleted, - # so avoid archiving source here. - if pn.startswith('glibc-locale'): - return False - if d.getVar('PN') == "libtool-cross": - return False - if d.getVar('PN') == "libgcc-initial": - return False - if d.getVar('PN') == "shadow-sysroot": - return False - - # We just archive gcc-source for all the gcc related recipes - if d.getVar('BPN') in ['gcc', 'libgcc']: - bb.debug(1, 'spdx: There is bug in scan of %s is, do nothing' % pn) - return False - - return True - -def collect_direct_deps(d, dep_task): - current_task = "do_" + d.getVar("BB_CURRENTTASK") - pn = d.getVar("PN") - - taskdepdata = d.getVar("BB_TASKDEPDATA", False) - - for this_dep in taskdepdata.values(): - if this_dep[0] == pn and this_dep[1] == current_task: - break - else: - bb.fatal(f"Unable to find this {pn}:{current_task} in taskdepdata") - - deps = set() - - for dep_name in this_dep.deps: - dep_data = taskdepdata[dep_name] - if dep_data.taskname == dep_task and dep_data.pn != pn: - deps.add((dep_data.pn, dep_data.hashfn, dep_name in this_dep.taskhash_deps)) - - return sorted(deps) - -collect_direct_deps[vardepsexclude] += "BB_TASKDEPDATA" -collect_direct_deps[vardeps] += "DEPENDS" python do_collect_spdx_deps() { # This task calculates the build time dependencies of the recipe, and is @@ -136,11 +51,12 @@ python do_collect_spdx_deps() { # do_create_spdx reads in the found dependencies when writing the actual # SPDX document import json + import oe.spdx_common from pathlib import Path spdx_deps_file = Path(d.getVar("SPDXDEPS")) - deps = collect_direct_deps(d, "do_create_spdx") + deps = oe.spdx_common.collect_direct_deps(d, "do_create_spdx") with spdx_deps_file.open("w") as f: json.dump(deps, f) @@ -151,104 +67,7 @@ do_collect_spdx_deps[depends] += "${PATCHDEPENDENCY}" do_collect_spdx_deps[deptask] = "do_create_spdx" do_collect_spdx_deps[dirs] = "${SPDXDIR}" -def get_spdx_deps(d): - import json - from pathlib import Path - - spdx_deps_file = Path(d.getVar("SPDXDEPS")) - - with spdx_deps_file.open("r") as f: - return json.load(f) - -def collect_package_providers(d): - from pathlib import Path - import oe.sbom - import oe.spdx - import json - - deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX")) - - providers = {} - - deps = collect_direct_deps(d, "do_create_spdx") - deps.append((d.getVar("PN"), d.getVar("BB_HASHFILENAME"), True)) - - for dep_pn, dep_hashfn, _ in deps: - localdata = d - recipe_data = oe.packagedata.read_pkgdata(dep_pn, localdata) - if not recipe_data: - localdata = bb.data.createCopy(d) - localdata.setVar("PKGDATA_DIR", "${PKGDATA_DIR_SDK}") - recipe_data = oe.packagedata.read_pkgdata(dep_pn, localdata) - - for pkg in recipe_data.get("PACKAGES", "").split(): - - pkg_data = oe.packagedata.read_subpkgdata_dict(pkg, localdata) - rprovides = set(n for n, _ in bb.utils.explode_dep_versions2(pkg_data.get("RPROVIDES", "")).items()) - rprovides.add(pkg) - - if "PKG" in pkg_data: - pkg = pkg_data["PKG"] - rprovides.add(pkg) - - for r in rprovides: - 
providers[r] = (pkg, dep_hashfn) - - return providers - -collect_package_providers[vardepsexclude] += "BB_TASKDEPDATA" - -def spdx_get_src(d): - """ - save patched source of the recipe in SPDX_WORKDIR. - """ - import shutil - spdx_workdir = d.getVar('SPDXWORK') - spdx_sysroot_native = d.getVar('STAGING_DIR_NATIVE') - pn = d.getVar('PN') - - workdir = d.getVar("WORKDIR") - - try: - # The kernel class functions require it to be on work-shared, so we dont change WORKDIR - if not is_work_shared_spdx(d): - # Change the WORKDIR to make do_unpack do_patch run in another dir. - d.setVar('WORKDIR', spdx_workdir) - # Restore the original path to recipe's native sysroot (it's relative to WORKDIR). - d.setVar('STAGING_DIR_NATIVE', spdx_sysroot_native) - - # The changed 'WORKDIR' also caused 'B' changed, create dir 'B' for the - # possibly requiring of the following tasks (such as some recipes's - # do_patch required 'B' existed). - bb.utils.mkdirhier(d.getVar('B')) - - bb.build.exec_func('do_unpack', d) - # Copy source of kernel to spdx_workdir - if is_work_shared_spdx(d): - share_src = d.getVar('WORKDIR') - d.setVar('WORKDIR', spdx_workdir) - d.setVar('STAGING_DIR_NATIVE', spdx_sysroot_native) - src_dir = spdx_workdir + "/" + d.getVar('PN')+ "-" + d.getVar('PV') + "-" + d.getVar('PR') - bb.utils.mkdirhier(src_dir) - if bb.data.inherits_class('kernel',d): - share_src = d.getVar('STAGING_KERNEL_DIR') - cmd_copy_share = "cp -rf " + share_src + "/* " + src_dir + "/" - cmd_copy_shared_res = os.popen(cmd_copy_share).read() - bb.note("cmd_copy_shared_result = " + cmd_copy_shared_res) - - git_path = src_dir + "/.git" - if os.path.exists(git_path): - shutils.rmtree(git_path) - - # Make sure gcc and kernel sources are patched only once - if not (d.getVar('SRC_URI') == "" or is_work_shared_spdx(d)): - bb.build.exec_func('do_patch', d) - - # Some userland has no source. 
- if not os.path.exists( spdx_workdir ): - bb.utils.mkdirhier(spdx_workdir) - finally: - d.setVar("WORKDIR", workdir) - -spdx_get_src[vardepsexclude] += "STAGING_KERNEL_DIR" - +oe.spdx_common.collect_direct_deps[vardepsexclude] += "BB_TASKDEPDATA" +oe.spdx_common.collect_direct_deps[vardeps] += "DEPENDS" +oe.spdx_common.collect_package_providers[vardepsexclude] += "BB_TASKDEPDATA" +oe.spdx_common.get_patched_src[vardepsexclude] += "STAGING_KERNEL_DIR" diff --git a/meta/lib/oe/sbom30.py b/meta/lib/oe/sbom30.py index 771e87be796..2532d19dad2 100644 --- a/meta/lib/oe/sbom30.py +++ b/meta/lib/oe/sbom30.py @@ -12,6 +12,7 @@ import re import hashlib import uuid import os +import oe.spdx_common from datetime import datetime, timezone OE_SPDX_BASE = "https://rdf.openembedded.org/spdx/3.0/" @@ -205,24 +206,6 @@ def get_alias(obj): return None -def extract_licenses(filename): - lic_regex = re.compile( - rb"^\W*SPDX-License-Identifier:\s*([ \w\d.()+-]+?)(?:\s+\W*)?$", re.MULTILINE - ) - - try: - with open(filename, "rb") as f: - size = min(15000, os.stat(filename).st_size) - txt = f.read(size) - licenses = re.findall(lic_regex, txt) - if licenses: - ascii_licenses = [lic.decode("ascii") for lic in licenses] - return ascii_licenses - except Exception as e: - bb.warn(f"Exception reading {filename}: {e}") - return [] - - def to_list(l): if isinstance(l, set): l = sorted(list(l)) @@ -630,7 +613,7 @@ class ObjectSet(oe.spdx30.SHACLObjectSet): return file_licenses = set() - for extracted_lic in extract_licenses(filepath): + for extracted_lic in oe.spdx_common.extract_licenses(filepath): file_licenses.add(self.new_license_expression(extracted_lic)) self.new_relationship( diff --git a/meta/lib/oe/spdx_common.py b/meta/lib/oe/spdx_common.py new file mode 100644 index 00000000000..f23100fe03d --- /dev/null +++ b/meta/lib/oe/spdx_common.py @@ -0,0 +1,228 @@ +# +# Copyright OpenEmbedded Contributors +# +# SPDX-License-Identifier: GPL-2.0-only +# + +import bb +import collections +import json +import oe.packagedata +import re +import shutil + +from pathlib import Path + + +LIC_REGEX = re.compile( + rb"^\W*SPDX-License-Identifier:\s*([ \w\d.()+-]+?)(?:\s+\W*)?$", + re.MULTILINE, +) + + +def extract_licenses(filename): + """ + Extract SPDX License identifiers from a file + """ + try: + with open(filename, "rb") as f: + size = min(15000, os.stat(filename).st_size) + txt = f.read(size) + licenses = re.findall(LIC_REGEX, txt) + if licenses: + ascii_licenses = [lic.decode("ascii") for lic in licenses] + return ascii_licenses + except Exception as e: + bb.warn(f"Exception reading {filename}: {e}") + return [] + + +def is_work_shared_spdx(d): + return bb.data.inherits_class("kernel", d) or ("work-shared" in d.getVar("WORKDIR")) + + +def load_spdx_license_data(d): + if d.getVar("SPDX_LICENSE_DATA"): + return + + with open(d.getVar("SPDX_LICENSES"), "r") as f: + data = json.load(f) + # Transform the license array to a dictionary + data["licenses"] = {l["licenseId"]: l for l in data["licenses"]} + d.setVar("SPDX_LICENSE_DATA", data) + + +def process_sources(d): + """ + Returns True if the sources for this recipe should be included in the SPDX + or False if not + """ + pn = d.getVar("PN") + assume_provided = (d.getVar("ASSUME_PROVIDED") or "").split() + if pn in assume_provided: + for p in d.getVar("PROVIDES").split(): + if p != pn: + pn = p + break + + # glibc-locale: do_fetch, do_unpack and do_patch tasks have been deleted, + # so avoid archiving source here. 
+ if pn.startswith("glibc-locale"): + return False + if d.getVar("PN") == "libtool-cross": + return False + if d.getVar("PN") == "libgcc-initial": + return False + if d.getVar("PN") == "shadow-sysroot": + return False + + # We just archive gcc-source for all the gcc related recipes + if d.getVar("BPN") in ["gcc", "libgcc"]: + bb.debug(1, "spdx: There is bug in scan of %s is, do nothing" % pn) + return False + + return True + + +Dep = collections.namedtuple("Dep", ["pn", "hashfn", "in_taskhash"]) + + +def collect_direct_deps(d, dep_task): + """ + Find direct dependencies of current task + + Returns the list of recipes that have a dep_task that the current task + depends on + """ + current_task = "do_" + d.getVar("BB_CURRENTTASK") + pn = d.getVar("PN") + + taskdepdata = d.getVar("BB_TASKDEPDATA", False) + + for this_dep in taskdepdata.values(): + if this_dep[0] == pn and this_dep[1] == current_task: + break + else: + bb.fatal(f"Unable to find this {pn}:{current_task} in taskdepdata") + + deps = set() + + for dep_name in this_dep.deps: + dep_data = taskdepdata[dep_name] + if dep_data.taskname == dep_task and dep_data.pn != pn: + deps.add((dep_data.pn, dep_data.hashfn, dep_name in this_dep.taskhash_deps)) + + return sorted(deps) + + +def get_spdx_deps(d): + """ + Reads the SPDX dependencies JSON file and returns the data + """ + spdx_deps_file = Path(d.getVar("SPDXDEPS")) + + deps = [] + with spdx_deps_file.open("r") as f: + for d in json.load(f): + deps.append(Dep(*d)) + return deps + + +def collect_package_providers(d): + """ + Returns a dictionary where each RPROVIDES is mapped to the package that + provides it + """ + deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX")) + + providers = {} + + deps = collect_direct_deps(d, "do_create_spdx") + deps.append((d.getVar("PN"), d.getVar("BB_HASHFILENAME"), True)) + + for dep_pn, dep_hashfn, _ in deps: + localdata = d + recipe_data = oe.packagedata.read_pkgdata(dep_pn, localdata) + if not recipe_data: + localdata = bb.data.createCopy(d) + localdata.setVar("PKGDATA_DIR", "${PKGDATA_DIR_SDK}") + recipe_data = oe.packagedata.read_pkgdata(dep_pn, localdata) + + for pkg in recipe_data.get("PACKAGES", "").split(): + pkg_data = oe.packagedata.read_subpkgdata_dict(pkg, localdata) + rprovides = set( + n + for n, _ in bb.utils.explode_dep_versions2( + pkg_data.get("RPROVIDES", "") + ).items() + ) + rprovides.add(pkg) + + if "PKG" in pkg_data: + pkg = pkg_data["PKG"] + rprovides.add(pkg) + + for r in rprovides: + providers[r] = (pkg, dep_hashfn) + + return providers + + +def get_patched_src(d): + """ + Save patched source of the recipe in SPDX_WORKDIR. + """ + spdx_workdir = d.getVar("SPDXWORK") + spdx_sysroot_native = d.getVar("STAGING_DIR_NATIVE") + pn = d.getVar("PN") + + workdir = d.getVar("WORKDIR") + + try: + # The kernel class functions require it to be on work-shared, so we dont change WORKDIR + if not is_work_shared_spdx(d): + # Change the WORKDIR to make do_unpack do_patch run in another dir. + d.setVar("WORKDIR", spdx_workdir) + # Restore the original path to recipe's native sysroot (it's relative to WORKDIR). + d.setVar("STAGING_DIR_NATIVE", spdx_sysroot_native) + + # The changed 'WORKDIR' also caused 'B' changed, create dir 'B' for the + # possibly requiring of the following tasks (such as some recipes's + # do_patch required 'B' existed). 
+ bb.utils.mkdirhier(d.getVar("B")) + + bb.build.exec_func("do_unpack", d) + # Copy source of kernel to spdx_workdir + if is_work_shared_spdx(d): + share_src = d.getVar("WORKDIR") + d.setVar("WORKDIR", spdx_workdir) + d.setVar("STAGING_DIR_NATIVE", spdx_sysroot_native) + src_dir = ( + spdx_workdir + + "/" + + d.getVar("PN") + + "-" + + d.getVar("PV") + + "-" + + d.getVar("PR") + ) + bb.utils.mkdirhier(src_dir) + if bb.data.inherits_class("kernel", d): + share_src = d.getVar("STAGING_KERNEL_DIR") + cmd_copy_share = "cp -rf " + share_src + "/* " + src_dir + "/" + cmd_copy_shared_res = os.popen(cmd_copy_share).read() + bb.note("cmd_copy_shared_result = " + cmd_copy_shared_res) + + git_path = src_dir + "/.git" + if os.path.exists(git_path): + shutils.rmtree(git_path) + + # Make sure gcc and kernel sources are patched only once + if not (d.getVar("SRC_URI") == "" or is_work_shared_spdx(d)): + bb.build.exec_func("do_patch", d) + + # Some userland has no source. + if not os.path.exists(spdx_workdir): + bb.utils.mkdirhier(spdx_workdir) + finally: + d.setVar("WORKDIR", workdir)
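For readers following the refactor in this patch: the new meta/lib/oe/spdx_common.py changes get_spdx_deps() so that, instead of bare tuples, it returns Dep namedtuples rebuilt from the JSON file written by do_collect_spdx_deps, and callers such as collect_dep_objsets() switch from tuple unpacking to dep.pn / dep.in_taskhash (see the create-spdx-3.0.bbclass hunks above). The following minimal sketch is not part of the patch; it only illustrates that JSON round trip outside of a BitBake build, and the file name and the recipe/hash values in it are made up:

import collections
import json
from pathlib import Path

# Same shape as the Dep namedtuple added in meta/lib/oe/spdx_common.py
Dep = collections.namedtuple("Dep", ["pn", "hashfn", "in_taskhash"])

def write_spdx_deps(path, deps):
    # do_collect_spdx_deps dumps the sorted (pn, hashfn, in_taskhash) tuples;
    # the JSON encoder stores each tuple as a plain list
    with path.open("w") as f:
        json.dump(sorted(deps), f)

def read_spdx_deps(path):
    # get_spdx_deps() reads the file back and wraps every entry in a Dep
    with path.open("r") as f:
        return [Dep(*entry) for entry in json.load(f)]

if __name__ == "__main__":
    deps_file = Path("spdx-deps-example.json")  # made-up path; a real build uses ${SPDXDEPS}
    write_spdx_deps(deps_file, [
        ("zlib", "deadbeefcafef00d", True),            # sample values only
        ("quilt-native", "0123456789abcdef", False),
    ])
    for dep in read_spdx_deps(deps_file):
        print(dep.pn, dep.hashfn, dep.in_taskhash)
    deps_file.unlink()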
From patchwork Fri Jul 12 15:58:20 2024 X-Patchwork-Submitter: Joshua Watt X-Patchwork-Id: 46272 From: Joshua Watt To: openembedded-core@lists.openembedded.org Cc: Joshua Watt Subject: [OE-core][PATCH v6 10/12] classes/create-spdx-3.0: Move tasks to library Date: Fri, 12 Jul 2024 09:58:20 -0600 Message-ID: <20240712160304.3514496-11-JPEWhacker@gmail.com> X-Mailer: git-send-email 2.45.2 In-Reply-To: <20240712160304.3514496-1-JPEWhacker@gmail.com> References: <20240703140059.4096394-1-JPEWhacker@gmail.com> <20240712160304.3514496-1-JPEWhacker@gmail.com> X-Groupsio-URL: https://lists.openembedded.org/g/openembedded-core/message/201844 Move the bulk of the python code in the SPDX 3.0 classes into a library file Signed-off-by: Joshua Watt --- meta/classes/create-spdx-3.0.bbclass | 874 +------------- meta/classes/create-spdx-image-3.0.bbclass | 307 +---- meta/lib/oe/spdx30_tasks.py | 1229 ++++++++++++++++++++ 3 files changed, 1256 insertions(+), 1154 deletions(-) create mode 100644 meta/lib/oe/spdx30_tasks.py diff --git a/meta/classes/create-spdx-3.0.bbclass b/meta/classes/create-spdx-3.0.bbclass index a930ea81152..41840d9d1a3 100644 --- a/meta/classes/create-spdx-3.0.bbclass +++ b/meta/classes/create-spdx-3.0.bbclass @@ -116,698 +116,15 @@ SPDX_PACKAGE_SUPPLIER[doc] = "The base variable name to describe the Agent who \ IMAGE_CLASSES:append = " create-spdx-image-3.0" -def set_timestamp_now(d, o, prop): - from datetime import datetime, timezone +oe.spdx30_tasks.set_timestamp_now[vardepsexclude] = "SPDX_INCLUDE_TIMESTAMPS" +oe.spdx30_tasks.get_package_sources_from_debug[vardepsexclude] += "STAGING_KERNEL_DIR" +oe.spdx30_tasks.collect_dep_objsets[vardepsexclude] = "SSTATE_ARCHS" - if d.getVar("SPDX_INCLUDE_TIMESTAMPS") == "1": - setattr(o, prop, datetime.now(timezone.utc)) - else: - # Doing this helps to validated that the property actually exists, and - # also that it is not mandatory - delattr(o, prop) - -set_timestamp_now[vardepsexclude] = "SPDX_INCLUDE_TIMESTAMPS" - -def add_license_expression(d, objset, license_expression): - from pathlib import Path - import oe.spdx30 - import oe.sbom30 - - license_data = d.getVar("SPDX_LICENSE_DATA") - simple_license_text = {} - license_text_map = {} - license_ref_idx = 0 - - def add_license_text(name): - nonlocal objset - nonlocal simple_license_text - - if name in simple_license_text: - return simple_license_text[name] - - lic = objset.find_filter( - oe.spdx30.simplelicensing_SimpleLicensingText, - name=name, - ) - - if lic is not None: - simple_license_text[name] = lic - return lic - - lic = objset.add(oe.spdx30.simplelicensing_SimpleLicensingText( - _id=objset.new_spdxid("license-text", name), - creationInfo=objset.doc.creationInfo, - name=name, - )) - simple_license_text[name] = lic - - if name == "PD": - lic.simplelicensing_licenseText = "Software
released to the public domain" - return lic - - # Seach for the license in COMMON_LICENSE_DIR and LICENSE_PATH - for directory in [d.getVar('COMMON_LICENSE_DIR')] + (d.getVar('LICENSE_PATH') or '').split(): - try: - with (Path(directory) / name).open(errors="replace") as f: - lic.simplelicensing_licenseText = f.read() - return lic - - except FileNotFoundError: - pass - - # If it's not SPDX or PD, then NO_GENERIC_LICENSE must be set - filename = d.getVarFlag('NO_GENERIC_LICENSE', name) - if filename: - filename = d.expand("${S}/" + filename) - with open(filename, errors="replace") as f: - lic.simplelicensing_licenseText = f.read() - return lic - else: - bb.fatal("Cannot find any text for license %s" % name) - - def convert(l): - nonlocal license_text_map - nonlocal license_ref_idx - - if l == "(" or l == ")": - return l - - if l == "&": - return "AND" - - if l == "|": - return "OR" - - if l == "CLOSED": - return "NONE" - - spdx_license = d.getVarFlag("SPDXLICENSEMAP", l) or l - if spdx_license in license_data["licenses"]: - return spdx_license - - spdx_license = "LicenseRef-" + l - if spdx_license not in license_text_map: - license_text_map[spdx_license] = add_license_text(l)._id - - return spdx_license - - lic_split = license_expression.replace("(", " ( ").replace(")", " ) ").replace("|", " | ").replace("&", " & ").split() - spdx_license_expression = ' '.join(convert(l) for l in lic_split) - - return objset.new_license_expression(spdx_license_expression, license_text_map) - - -def add_package_files(d, objset, topdir, get_spdxid, get_purposes, *, archive=None, ignore_dirs=[], ignore_top_level_dirs=[]): - from pathlib import Path - import oe.spdx30 - import oe.sbom30 - - source_date_epoch = d.getVar("SOURCE_DATE_EPOCH") - if source_date_epoch: - source_date_epoch = int(source_date_epoch) - - spdx_files = set() - - file_counter = 1 - for subdir, dirs, files in os.walk(topdir): - dirs[:] = [d for d in dirs if d not in ignore_dirs] - if subdir == str(topdir): - dirs[:] = [d for d in dirs if d not in ignore_top_level_dirs] - - for file in files: - filepath = Path(subdir) / file - if filepath.is_symlink() or not filepath.is_file(): - continue - - bb.debug(1, "Adding file %s to %s" % (filepath, objset.doc._id)) - - filename = str(filepath.relative_to(topdir)) - file_purposes = get_purposes(filepath) - - spdx_file = objset.new_file( - get_spdxid(file_counter), - filename, - filepath, - purposes=file_purposes, - ) - spdx_files.add(spdx_file) - - if oe.spdx30.software_SoftwarePurpose.source in file_purposes: - objset.scan_declared_licenses(spdx_file, filepath) - - if archive is not None: - with filepath.open("rb") as f: - info = archive.gettarinfo(fileobj=f) - info.name = filename - info.uid = 0 - info.gid = 0 - info.uname = "root" - info.gname = "root" - - if source_date_epoch is not None and info.mtime > source_date_epoch: - info.mtime = source_date_epoch - - archive.addfile(info, f) - - file_counter += 1 - - return spdx_files - - -def get_package_sources_from_debug(d, package, package_files, sources, source_hash_cache): - from pathlib import Path - import oe.packagedata - - def file_path_match(file_path, pkg_file): - if file_path.lstrip("/") == pkg_file.name.lstrip("/"): - return True - - for e in pkg_file.extension: - if isinstance(e, oe.sbom30.OEFileNameAliasExtension): - for a in e.aliases: - if file_path.lstrip("/") == a.lstrip("/"): - return True - - return False - - debug_search_paths = [ - Path(d.getVar('PKGD')), - Path(d.getVar('STAGING_DIR_TARGET')), - 
Path(d.getVar('STAGING_DIR_NATIVE')), - Path(d.getVar('STAGING_KERNEL_DIR')), - ] - - pkg_data = oe.packagedata.read_subpkgdata_extended(package, d) - - if pkg_data is None: - return - - dep_source_files = set() - - for file_path, file_data in pkg_data["files_info"].items(): - if not "debugsrc" in file_data: - continue - - if not any(file_path_match(file_path, pkg_file) for pkg_file in package_files): - bb.fatal("No package file found for %s in %s; SPDX found: %s" % (str(file_path), package, - " ".join(p.name for p in package_files))) - continue - - for debugsrc in file_data["debugsrc"]: - for search in debug_search_paths: - if debugsrc.startswith("/usr/src/kernel"): - debugsrc_path = search / debugsrc.replace('/usr/src/kernel/', '') - else: - debugsrc_path = search / debugsrc.lstrip("/") - - if debugsrc_path in source_hash_cache: - file_sha256 = source_hash_cache[debugsrc_path] - if file_sha256 is None: - continue - else: - if not debugsrc_path.exists(): - source_hash_cache[debugsrc_path] = None - continue - - file_sha256 = bb.utils.sha256_file(debugsrc_path) - source_hash_cache[debugsrc_path] = file_sha256 - - if file_sha256 in sources: - source_file = sources[file_sha256] - dep_source_files.add(source_file) - else: - bb.debug(1, "Debug source %s with SHA256 %s not found in any dependency" % (str(debugsrc_path), file_sha256)) - break - else: - bb.debug(1, "Debug source %s not found" % debugsrc) - - return dep_source_files - -get_package_sources_from_debug[vardepsexclude] += "STAGING_KERNEL_DIR" - -def collect_dep_objsets(d, build): - import json - from pathlib import Path - import oe.sbom30 - import oe.spdx30 - import oe.spdx_common - - deps = oe.spdx_common.get_spdx_deps(d) - - dep_objsets = [] - dep_builds = set() - - dep_build_spdxids = set() - for dep in deps: - bb.debug(1, "Fetching SPDX for dependency %s" % (dep.pn)) - dep_build, dep_objset = oe.sbom30.find_root_obj_in_jsonld(d, "recipes", dep.pn, oe.spdx30.build_Build) - # If the dependency is part of the taskhash, return it to be linked - # against. Otherwise, it cannot be linked against because this recipe - # will not rebuilt if dependency changes - if dep.in_taskhash: - dep_objsets.append(dep_objset) - - # The build _can_ be linked against (by alias) - dep_builds.add(dep_build) - - return dep_objsets, dep_builds - -collect_dep_objsets[vardepsexclude] = "SSTATE_ARCHS" - -def collect_dep_sources(dep_objsets): - import oe.spdx30 - import oe.sbom30 - - sources = {} - for objset in dep_objsets: - # Don't collect sources from native recipes as they - # match non-native sources also. 
- if objset.is_native(): - continue - - bb.debug(1, "Fetching Sources for dependency %s" % (objset.doc.name)) - - dep_build = objset.find_root(oe.spdx30.build_Build) - if not dep_build: - bb.fatal("Unable to find a build") - - for e in objset.foreach_type(oe.spdx30.Relationship): - if dep_build is not e.from_: - continue - - if e.relationshipType != oe.spdx30.RelationshipType.hasInputs: - continue - - for to in e.to: - if not isinstance(to, oe.spdx30.software_File): - continue - - if to.software_primaryPurpose != oe.spdx30.software_SoftwarePurpose.source: - continue - - for v in to.verifiedUsing: - if v.algorithm == oe.spdx30.HashAlgorithm.sha256: - sources[v.hashValue] = to - break - else: - bb.fatal("No SHA256 found for %s in %s" % (to.name, objset.doc.name)) - - return sources - -def add_download_files(d, objset): - import oe.patch - import oe.spdx30 - import os - - inputs = set() - - urls = d.getVar("SRC_URI").split() - fetch = bb.fetch2.Fetch(urls, d) - - for download_idx, src_uri in enumerate(urls): - fd = fetch.ud[src_uri] - - for name in fd.names: - file_name = os.path.basename(fetch.localpath(src_uri)) - if oe.patch.patch_path(src_uri, fetch, '', expand=False): - primary_purpose = oe.spdx30.software_SoftwarePurpose.patch - else: - primary_purpose = oe.spdx30.software_SoftwarePurpose.source - - if fd.type == "file": - if os.path.isdir(fd.localpath): - walk_idx = 1 - for root, dirs, files in os.walk(fd.localpath): - for f in files: - f_path = os.path.join(root, f) - if os.path.islink(f_path): - # TODO: SPDX doesn't support symlinks yet - continue - - file = objset.new_file( - objset.new_spdxid("source", str(download_idx + 1), str(walk_idx)), - os.path.join(file_name, os.path.relpath(f_path, fd.localpath)), - f_path, - purposes=[primary_purpose], - ) - - inputs.add(file) - walk_idx += 1 - - else: - file = objset.new_file( - objset.new_spdxid("source", str(download_idx + 1)), - file_name, - fd.localpath, - purposes=[primary_purpose], - ) - inputs.add(file) - - else: - uri = fd.type - proto = getattr(fd, "proto", None) - if proto is not None: - uri = uri + "+" + proto - uri = uri + "://" + fd.host + fd.path - - if fd.method.supports_srcrev(): - uri = uri + "@" + fd.revisions[name] - - dl = objset.add(oe.spdx30.software_Package( - _id=objset.new_spdxid("source", str(download_idx + 1)), - creationInfo=objset.doc.creationInfo, - name=file_name, - software_primaryPurpose=primary_purpose, - software_downloadLocation=uri, - )) - - if fd.method.supports_checksum(fd): - # TODO Need something better than hard coding this - for checksum_id in ["sha256", "sha1"]: - expected_checksum = getattr(fd, "%s_expected" % checksum_id, None) - if expected_checksum is None: - continue - - dl.verifiedUsing.append( - oe.spdx30.Hash( - algorithm=getattr(oe.spdx30.HashAlgorithm, checksum_id), - hashValue=expected_checksum, - ) - ) - - inputs.add(dl) - - return inputs - - -def set_purposes(d, element, *var_names, force_purposes=[]): - purposes = force_purposes[:] - - for var_name in var_names: - val = d.getVar(var_name) - if val: - purposes.extend(val.split()) - break - - if not purposes: - bb.warn("No SPDX purposes found in %s" % " ".join(var_names)) - return - - element.software_primaryPurpose = getattr(oe.spdx30.software_SoftwarePurpose, purposes[0]) - element.software_additionalPurpose = [getattr(oe.spdx30.software_SoftwarePurpose, p) for p in purposes[1:]] python do_create_spdx() { - import oe.sbom30 - import oe.spdx30 - import oe.spdx_common - from pathlib import Path - from contextlib import contextmanager 
- import oe.cve_check - from datetime import datetime - - def set_var_field(var, obj, name, package=None): - val = None - if package: - val = d.getVar("%s:%s" % (var, package)) - - if not val: - val = d.getVar(var) - - if val: - setattr(obj, name, val) - - deploydir = Path(d.getVar("SPDXDEPLOY")) - deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX")) - spdx_workdir = Path(d.getVar("SPDXWORK")) - include_sources = d.getVar("SPDX_INCLUDE_SOURCES") == "1" - pkg_arch = d.getVar("SSTATE_PKGARCH") - is_native = bb.data.inherits_class("native", d) or bb.data.inherits_class("cross", d) - include_vex = d.getVar("SPDX_INCLUDE_VEX") - if not include_vex in ("none", "current", "all"): - bb.fatal("SPDX_INCLUDE_VEX must be one of 'none', 'current', 'all'") - - build_objset = oe.sbom30.ObjectSet.new_objset(d, d.getVar("PN")) - - build = build_objset.new_task_build("recipe", "recipe") - build_objset.doc.rootElement.append(build) - - build_objset.set_is_native(is_native) - - for var in (d.getVar('SPDX_CUSTOM_ANNOTATION_VARS') or "").split(): - new_annotation( - d, - build_objset, - build, - "%s=%s" % (var, d.getVar(var)), - oe.spdx30.AnnotationType.other - ) - - build_inputs = set() - - # Add CVEs - cve_by_status = {} - if include_vex != "none": - for cve in (d.getVarFlags("CVE_STATUS") or {}): - status, detail, description = oe.cve_check.decode_cve_status(d, cve) - - # If this CVE is fixed upstream, skip it unless all CVEs are - # specified. - if include_vex != "all" and detail in ("fixed-version", "cpe-stable-backport"): - bb.debug(1, "Skipping %s since it is already fixed upstream" % cve) - continue - - cve_by_status.setdefault(status, {})[cve] = ( - build_objset.new_cve_vuln(cve), - detail, - description, - ) - - cpe_ids = oe.cve_check.get_cpe_ids(d.getVar("CVE_PRODUCT"), d.getVar("CVE_VERSION")) - - source_files = add_download_files(d, build_objset) - build_inputs |= source_files - - recipe_spdx_license = add_license_expression(d, build_objset, d.getVar("LICENSE")) - build_objset.new_relationship( - source_files, - oe.spdx30.RelationshipType.hasConcludedLicense, - [recipe_spdx_license], - ) - - if oe.spdx_common.process_sources(d) and include_sources: - bb.debug(1, "Adding source files to SPDX") - oe.spdx_common.get_patched_src(d) - - build_inputs |= add_package_files( - d, - build_objset, - spdx_workdir, - lambda file_counter: build_objset.new_spdxid("sourcefile", str(file_counter)), - lambda filepath: [oe.spdx30.software_SoftwarePurpose.source], - ignore_dirs=[".git"], - ignore_top_level_dirs=["temp"], - archive=None, - ) - - - dep_objsets, dep_builds = collect_dep_objsets(d, build) - if dep_builds: - build_objset.new_scoped_relationship( - [build], - oe.spdx30.RelationshipType.dependsOn, - oe.spdx30.LifecycleScopeType.build, - sorted(oe.sbom30.get_element_link_id(b) for b in dep_builds), - ) - - debug_source_ids = set() - source_hash_cache = {} - - # Write out the package SPDX data now. 
It is not complete as we cannot - # write the runtime data, so write it to a staging area and a later task - # will write out the final collection - - # TODO: Handle native recipe output - if not is_native: - bb.debug(1, "Collecting Dependency sources files") - sources = collect_dep_sources(dep_objsets) - - bb.build.exec_func("read_subpackage_metadata", d) - - pkgdest = Path(d.getVar("PKGDEST")) - for package in d.getVar("PACKAGES").split(): - if not oe.packagedata.packaged(package, d): - continue - - pkg_name = d.getVar("PKG:%s" % package) or package - - bb.debug(1, "Creating SPDX for package %s" % pkg_name) - - pkg_objset = oe.sbom30.ObjectSet.new_objset(d, pkg_name) - - spdx_package = pkg_objset.add_root(oe.spdx30.software_Package( - _id=pkg_objset.new_spdxid("package", pkg_name), - creationInfo=pkg_objset.doc.creationInfo, - name=pkg_name, - software_packageVersion=d.getVar("PV"), - )) - set_timestamp_now(d, spdx_package, "builtTime") - - set_purposes( - d, - spdx_package, - "SPDX_PACKAGE_ADDITIONAL_PURPOSE:%s" % package, - "SPDX_PACKAGE_ADDITIONAL_PURPOSE", - force_purposes=["install"], - ) - - - supplier = build_objset.new_agent("SPDX_PACKAGE_SUPPLIER") - if supplier is not None: - spdx_package.supplier = supplier if isinstance(supplier, str) else supplier._id - - set_var_field("HOMEPAGE", spdx_package, "software_homePage", package=package) - set_var_field("SUMMARY", spdx_package, "summary", package=package) - set_var_field("DESCRIPTION", spdx_package, "description", package=package) - - pkg_objset.new_scoped_relationship( - [build._id], - oe.spdx30.RelationshipType.hasOutputs, - oe.spdx30.LifecycleScopeType.build, - [spdx_package], - ) - - for cpe_id in cpe_ids: - spdx_package.externalIdentifier.append( - oe.spdx30.ExternalIdentifier( - externalIdentifierType=oe.spdx30.ExternalIdentifierType.cpe23, - identifier=cpe_id, - )) - - # TODO: Generate a file for each actual IPK/DEB/RPM/TGZ file - # generated and link it to the package - #spdx_package_file = pkg_objset.add(oe.spdx30.software_File( - # _id=pkg_objset.new_spdxid("distribution", pkg_name), - # creationInfo=pkg_objset.doc.creationInfo, - # name=pkg_name, - # software_primaryPurpose=spdx_package.software_primaryPurpose, - # software_additionalPurpose=spdx_package.software_additionalPurpose, - #)) - #set_timestamp_now(d, spdx_package_file, "builtTime") - - ## TODO add hashes - #pkg_objset.new_relationship( - # [spdx_package], - # oe.spdx30.RelationshipType.hasDistributionArtifact, - # [spdx_package_file], - #) - - # NOTE: licenses live in the recipe collection and are referenced - # by ID in the package collection(s). 
This helps reduce duplication - # (since a lot of packages will have the same license), and also - # prevents duplicate license SPDX IDs in the packages - package_license = d.getVar("LICENSE:%s" % package) - if package_license and package_license != d.getVar("LICENSE"): - package_spdx_license = add_license_expression(d, build_objset, package_license) - else: - package_spdx_license = recipe_spdx_license - - pkg_objset.new_relationship( - [spdx_package], - oe.spdx30.RelationshipType.hasConcludedLicense, - [package_spdx_license._id], - ) - - # NOTE: CVE Elements live in the recipe collection - all_cves = set() - for status, cves in cve_by_status.items(): - for cve, items in cves.items(): - spdx_cve, detail, description = items - - all_cves.add(spdx_cve._id) - - if status == "Patched": - pkg_objset.new_vex_patched_relationship([spdx_cve._id], [spdx_package]) - elif status == "Unpatched": - pkg_objset.new_vex_unpatched_relationship([spdx_cve._id], [spdx_package]) - elif status == "Ignored": - spdx_vex = pkg_objset.new_vex_ignored_relationship( - [spdx_cve._id], - [spdx_package], - impact_statement=description, - ) - - if detail in ("ignored", "cpe-incorrect", "disputed", "upstream-wontfix"): - # VEX doesn't have justifications for this - pass - elif detail in ("not-applicable-config", "not-applicable-platform"): - for v in spdx_vex: - v.security_justificationType = oe.spdx30.security_VexJustificationType.vulnerableCodeNotPresent - else: - bb.fatal(f"Unknown detail '{detail}' for ignored {cve}") - else: - bb.fatal(f"Unknown CVE status {status}") - - if all_cves: - pkg_objset.new_relationship( - [spdx_package], - oe.spdx30.RelationshipType.hasAssociatedVulnerability, - sorted(list(all_cves)), - ) - - bb.debug(1, "Adding package files to SPDX for package %s" % pkg_name) - package_files = add_package_files( - d, - pkg_objset, - pkgdest / package, - lambda file_counter: pkg_objset.new_spdxid("package", pkg_name, "file", str(file_counter)), - # TODO: Can we know the purpose here? 
- lambda filepath: [], - ignore_top_level_dirs=['CONTROL', 'DEBIAN'], - archive=None, - ) - - if package_files: - pkg_objset.new_relationship( - [spdx_package], - oe.spdx30.RelationshipType.contains, - sorted(list(package_files)), - ) - - if include_sources: - debug_sources = get_package_sources_from_debug(d, package, package_files, sources, source_hash_cache) - debug_source_ids |= set(oe.sbom30.get_element_link_id(d) for d in debug_sources) - - oe.sbom30.write_recipe_jsonld_doc(d, pkg_objset, "packages-staging", deploydir, create_spdx_id_links=False) - - if include_sources: - bb.debug(1, "Adding sysroot files to SPDX") - sysroot_files = add_package_files( - d, - build_objset, - d.expand("${COMPONENTS_DIR}/${PACKAGE_ARCH}/${PN}"), - lambda file_counter: build_objset.new_spdxid("sysroot", str(file_counter)), - lambda filepath: [], - archive=None, - ) - - if sysroot_files: - build_objset.new_scoped_relationship( - [build], - oe.spdx30.RelationshipType.hasOutputs, - oe.spdx30.LifecycleScopeType.build, - sorted(list(sysroot_files)), - ) - - if build_inputs or debug_source_ids: - build_objset.new_scoped_relationship( - [build], - oe.spdx30.RelationshipType.hasInputs, - oe.spdx30.LifecycleScopeType.build, - sorted(list(build_inputs)) + sorted(list(debug_source_ids)), - ) - - oe.sbom30.write_recipe_jsonld_doc(d, build_objset, "recipes", deploydir) + import oe.spdx30_tasks + oe.spdx30_tasks.create_spdx(d) } do_create_spdx[vardepsexclude] += "BB_NUMBER_THREADS" addtask do_create_spdx after \ @@ -844,101 +161,9 @@ do_create_spdx[cleandirs] = "${SPDXDEPLOY} ${SPDXWORK}" do_create_spdx[depends] += "${PATCHDEPENDENCY}" python do_create_package_spdx() { - import oe.sbom30 - import oe.spdx30 - import oe.spdx_common - import oe.packagedata - from pathlib import Path - - deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX")) - deploydir = Path(d.getVar("SPDXRUNTIMEDEPLOY")) - is_native = bb.data.inherits_class("native", d) or bb.data.inherits_class("cross", d) - - providers = oe.spdx_common.collect_package_providers(d) - pkg_arch = d.getVar("SSTATE_PKGARCH") - - if not is_native: - bb.build.exec_func("read_subpackage_metadata", d) - - dep_package_cache = {} - - # Any element common to all packages that need to be referenced by ID - # should be written into this objset set - common_objset = oe.sbom30.ObjectSet.new_objset(d, "%s-package-common" % d.getVar("PN")) - - pkgdest = Path(d.getVar("PKGDEST")) - for package in d.getVar("PACKAGES").split(): - localdata = bb.data.createCopy(d) - pkg_name = d.getVar("PKG:%s" % package) or package - localdata.setVar("PKG", pkg_name) - localdata.setVar('OVERRIDES', d.getVar("OVERRIDES", False) + ":" + package) - - if not oe.packagedata.packaged(package, localdata): - continue - - spdx_package, pkg_objset = oe.sbom30.load_obj_in_jsonld( - d, - pkg_arch, - "packages-staging", - pkg_name, - oe.spdx30.software_Package, - software_primaryPurpose=oe.spdx30.software_SoftwarePurpose.install, - ) - - # We will write out a new collection, so link it to the new - # creation info in the common package data. 
The old creation info - # should still exist and be referenced by all the existing elements - # in the package - pkg_objset.creationInfo = pkg_objset.copy_creation_info(common_objset.doc.creationInfo) - - runtime_spdx_deps = set() - - deps = bb.utils.explode_dep_versions2(localdata.getVar("RDEPENDS") or "") - seen_deps = set() - for dep, _ in deps.items(): - if dep in seen_deps: - continue - - if dep not in providers: - continue - - (dep, _) = providers[dep] - - if not oe.packagedata.packaged(dep, localdata): - continue - - dep_pkg_data = oe.packagedata.read_subpkgdata_dict(dep, d) - dep_pkg = dep_pkg_data["PKG"] - - if dep in dep_package_cache: - dep_spdx_package = dep_package_cache[dep] - else: - bb.debug(1, "Searching for %s" % dep_pkg) - dep_spdx_package, _ = oe.sbom30.find_root_obj_in_jsonld( - d, - "packages-staging", - dep_pkg, - oe.spdx30.software_Package, - software_primaryPurpose=oe.spdx30.software_SoftwarePurpose.install, - ) - dep_package_cache[dep] = dep_spdx_package - - runtime_spdx_deps.add(dep_spdx_package) - seen_deps.add(dep) - - if runtime_spdx_deps: - pkg_objset.new_scoped_relationship( - [spdx_package], - oe.spdx30.RelationshipType.dependsOn, - oe.spdx30.LifecycleScopeType.runtime, - [oe.sbom30.get_element_link_id(dep) for dep in runtime_spdx_deps], - ) - - oe.sbom30.write_recipe_jsonld_doc(d, pkg_objset, "packages", deploydir) - - oe.sbom30.write_recipe_jsonld_doc(d, common_objset, "common-package", deploydir) + import oe.spdx30_tasks + oe.spdx30_tasks.create_package_spdx(d) } - do_create_package_spdx[vardepsexclude] += "OVERRIDES SSTATE_ARCHS" addtask do_create_package_spdx after do_create_spdx before do_build do_rm_work @@ -955,91 +180,10 @@ do_create_package_spdx[dirs] = "${SPDXRUNTIMEDEPLOY}" do_create_package_spdx[cleandirs] = "${SPDXRUNTIMEDEPLOY}" do_create_package_spdx[rdeptask] = "do_create_spdx" - - python spdx30_build_started_handler () { - import oe.spdx30 - import oe.sbom30 - import oe.spdx_common - import os - from pathlib import Path - from datetime import datetime, timezone - - # Create a copy of the datastore. 
Set PN to "bitbake" so that SPDX IDs can - # be generated + import oe.spdx30_tasks d = e.data.createCopy() - d.setVar("PN", "bitbake") - d.setVar("BB_TASKHASH", "bitbake") - oe.spdx_common.load_spdx_license_data(d) - - deploy_dir_spdx = Path(e.data.getVar("DEPLOY_DIR_SPDX")) - - objset = oe.sbom30.ObjectSet.new_objset(d, "bitbake", False) - - host_import_key = d.getVar("SPDX_BUILD_HOST") - invoked_by = objset.new_agent("SPDX_INVOKED_BY", add=False) - on_behalf_of = objset.new_agent("SPDX_ON_BEHALF_OF", add=False) - - if d.getVar("SPDX_INCLUDE_BITBAKE_PARENT_BUILD") == "1": - # Since the Build objects are unique, we may as well set the creation - # time to the current time instead of the fallback SDE - objset.doc.creationInfo.created = datetime.now(timezone.utc) - - # Each invocation of bitbake should have a unique ID since it is a - # unique build - nonce = os.urandom(16).hex() - - build = objset.add_root(oe.spdx30.build_Build( - _id=objset.new_spdxid(nonce, include_unihash=False), - creationInfo=objset.doc.creationInfo, - build_buildType=oe.sbom30.SPDX_BUILD_TYPE, - )) - set_timestamp_now(d, build, "build_buildStartTime") - - if host_import_key: - objset.new_scoped_relationship( - [build], - oe.spdx30.RelationshipType.hasHost, - oe.spdx30.LifecycleScopeType.build, - [objset.new_import("SPDX_BUILD_HOST")], - ) - - if invoked_by: - objset.add(invoked_by) - invoked_by_spdx = objset.new_scoped_relationship( - [build], - oe.spdx30.RelationshipType.invokedBy, - oe.spdx30.LifecycleScopeType.build, - [invoked_by], - ) - - if on_behalf_of: - objset.add(on_behalf_of) - objset.new_scoped_relationship( - [on_behalf_of], - oe.spdx30.RelationshipType.delegatedTo, - oe.spdx30.LifecycleScopeType.build, - invoked_by_spdx, - ) - - elif on_behalf_of: - bb.warn("SPDX_ON_BEHALF_OF has no effect if SPDX_INVOKED_BY is not set") - - else: - if host_import_key: - bb.warn("SPDX_BUILD_HOST has no effect if SPDX_INCLUDE_BITBAKE_PARENT_BUILD is not set") - - if invoked_by: - bb.warn("SPDX_INVOKED_BY has no effect if SPDX_INCLUDE_BITBAKE_PARENT_BUILD is not set") - - if on_behalf_of: - bb.warn("SPDX_ON_BEHALF_OF has no effect if SPDX_INCLUDE_BITBAKE_PARENT_BUILD is not set") - - for obj in objset.foreach_type(oe.spdx30.Element): - obj.extension.append(oe.sbom30.OELinkExtension(link_spdx_id=False)) - obj.extension.append(oe.sbom30.OEIdAliasExtension()) - - oe.sbom30.write_jsonld_doc(d, objset, deploy_dir_spdx / "bitbake.spdx.json") + oe.spdx30_tasks.write_bitbake_spdx(d) } addhandler spdx30_build_started_handler diff --git a/meta/classes/create-spdx-image-3.0.bbclass b/meta/classes/create-spdx-image-3.0.bbclass index 467719555d6..1cad8537d14 100644 --- a/meta/classes/create-spdx-image-3.0.bbclass +++ b/meta/classes/create-spdx-image-3.0.bbclass @@ -9,37 +9,6 @@ SPDX_ROOTFS_PACKAGES = "${SPDXDIR}/rootfs-packages.json" SPDXIMAGEDEPLOYDIR = "${SPDXDIR}/image-deploy" SPDXROOTFSDEPLOY = "${SPDXDIR}/rootfs-deploy" -def collect_build_package_inputs(d, objset, build, packages): - import oe.spdx_common - providers = oe.spdx_common.collect_package_providers(d) - - build_deps = set() - - for name in sorted(packages.keys()): - if name not in providers: - bb.fatal("Unable to find SPDX provider for '%s'" % name) - - pkg_name, pkg_hashfn = providers[name] - - # Copy all of the package SPDX files into the Sbom elements - pkg_spdx, _ = oe.sbom30.find_root_obj_in_jsonld( - d, - "packages", - pkg_name, - oe.spdx30.software_Package, - software_primaryPurpose=oe.spdx30.software_SoftwarePurpose.install, - ) - build_deps.add(pkg_spdx._id) - 
- if build_deps: - objset.new_scoped_relationship( - [build], - oe.spdx30.RelationshipType.hasInputs, - oe.spdx30.LifecycleScopeType.build, - sorted(list(build_deps)), - ) - - python spdx_collect_rootfs_packages() { import json from pathlib import Path @@ -58,44 +27,8 @@ python spdx_collect_rootfs_packages() { ROOTFS_POSTUNINSTALL_COMMAND =+ "spdx_collect_rootfs_packages" python do_create_rootfs_spdx() { - import json - from pathlib import Path - import oe.spdx30 - import oe.sbom30 - from datetime import datetime - - deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX")) - deploydir = Path(d.getVar("SPDXROOTFSDEPLOY")) - root_packages_file = Path(d.getVar("SPDX_ROOTFS_PACKAGES")) - image_basename = d.getVar("IMAGE_BASENAME") - machine = d.getVar("MACHINE") - - with root_packages_file.open("r") as f: - packages = json.load(f) - - objset = oe.sbom30.ObjectSet.new_objset(d, "%s-%s" % (image_basename, machine)) - - rootfs = objset.add_root(oe.spdx30.software_Package( - _id=objset.new_spdxid("rootfs", image_basename), - creationInfo=objset.doc.creationInfo, - name=image_basename, - software_primaryPurpose=oe.spdx30.software_SoftwarePurpose.archive, - )) - set_timestamp_now(d, rootfs, "builtTime") - - rootfs_build = objset.add_root(objset.new_task_build("rootfs", "rootfs")) - set_timestamp_now(d, rootfs_build, "build_buildEndTime") - - objset.new_scoped_relationship( - [rootfs_build], - oe.spdx30.RelationshipType.hasOutputs, - oe.spdx30.LifecycleScopeType.build, - [rootfs], - ) - - collect_build_package_inputs(d, objset, rootfs_build, packages) - - oe.sbom30.write_recipe_jsonld_doc(d, objset, "rootfs", deploydir) + import oe.spdx30_tasks + oe.spdx30_tasks.create_rootfs_spdx(d) } addtask do_create_rootfs_spdx after do_rootfs before do_image SSTATETASKS += "do_create_rootfs_spdx" @@ -110,79 +43,8 @@ python do_create_rootfs_spdx_setscene() { addtask do_create_rootfs_spdx_setscene python do_create_image_spdx() { - import oe.spdx30 - import oe.sbom30 - import json - from pathlib import Path - - image_deploy_dir = Path(d.getVar('IMGDEPLOYDIR')) - manifest_path = Path(d.getVar("IMAGE_OUTPUT_MANIFEST")) - spdx_work_dir = Path(d.getVar('SPDXIMAGEWORK')) - - image_basename = d.getVar('IMAGE_BASENAME') - machine = d.getVar("MACHINE") - - objset = oe.sbom30.ObjectSet.new_objset(d, "%s-%s" % (image_basename, machine)) - - with manifest_path.open("r") as f: - manifest = json.load(f) - - builds = [] - for task in manifest: - imagetype = task["imagetype"] - taskname = task["taskname"] - - image_build = objset.add_root(objset.new_task_build(taskname, "image/%s" % imagetype)) - set_timestamp_now(d, image_build, "build_buildEndTime") - builds.append(image_build) - - artifacts = [] - - for image in task["images"]: - image_filename = image["filename"] - image_path = image_deploy_dir / image_filename - a = objset.add_root(oe.spdx30.software_File( - _id=objset.new_spdxid("image", image_filename), - creationInfo=objset.doc.creationInfo, - name=image_filename, - verifiedUsing=[ - oe.spdx30.Hash( - algorithm=oe.spdx30.HashAlgorithm.sha256, - hashValue=bb.utils.sha256_file(image_path), - ) - ] - )) - set_purposes(d, a, "SPDX_IMAGE_PURPOSE:%s" % imagetype, "SPDX_IMAGE_PURPOSE") - set_timestamp_now(d, a, "builtTime") - - artifacts.append(a) - - if artifacts: - objset.new_scoped_relationship( - [image_build], - oe.spdx30.RelationshipType.hasOutputs, - oe.spdx30.LifecycleScopeType.build, - artifacts, - ) - - if builds: - rootfs_image, _ = oe.sbom30.find_root_obj_in_jsonld( - d, - "rootfs", - "%s-%s" % (image_basename, machine), 
- oe.spdx30.software_Package, - # TODO: Should use a purpose to filter here? - ) - objset.new_scoped_relationship( - builds, - oe.spdx30.RelationshipType.hasInputs, - oe.spdx30.LifecycleScopeType.build, - [rootfs_image._id], - ) - - objset.add_aliases() - objset.link() - oe.sbom30.write_recipe_jsonld_doc(d, objset, "image", spdx_work_dir) + import oe.spdx30_tasks + oe.spdx30_tasks.create_image_spdx(d) } addtask do_create_image_spdx after do_image_complete do_create_rootfs_spdx before do_build SSTATETASKS += "do_create_image_spdx" @@ -199,46 +61,8 @@ addtask do_create_image_spdx_setscene python do_create_image_sbom_spdx() { - import os - from pathlib import Path - import oe.spdx30 - import oe.sbom30 - - image_name = d.getVar("IMAGE_NAME") - image_basename = d.getVar("IMAGE_BASENAME") - image_link_name = d.getVar("IMAGE_LINK_NAME") - imgdeploydir = Path(d.getVar("SPDXIMAGEDEPLOYDIR")) - machine = d.getVar("MACHINE") - - spdx_path = imgdeploydir / (image_name + ".spdx.json") - - root_elements = [] - - # TODO: Do we need to add the rootfs or are the image files sufficient? - rootfs_image, _ = oe.sbom30.find_root_obj_in_jsonld( - d, - "rootfs", - "%s-%s" % (image_basename, machine), - oe.spdx30.software_Package, - # TODO: Should use a purpose here? - ) - root_elements.append(rootfs_image._id) - - image_objset, _ = oe.sbom30.find_jsonld(d, "image", "%s-%s" % (image_basename, machine), required=True) - for o in image_objset.foreach_root(oe.spdx30.software_File): - root_elements.append(o._id) - - objset, sbom = oe.sbom30.create_sbom(d, image_name, root_elements) - - oe.sbom30.write_jsonld_doc(d, objset, spdx_path) - - def make_image_link(target_path, suffix): - if image_link_name: - link = imgdeploydir / (image_link_name + suffix) - if link != target_path: - link.symlink_to(os.path.relpath(target_path, link.parent)) - - make_image_link(spdx_path, ".spdx.json") + import oe.spdx30_tasks + oe.spdx30_tasks.create_image_sbom_spdx(d) } addtask do_create_image_sbom_spdx after do_create_rootfs_spdx do_create_image_spdx before do_build SSTATETASKS += "do_create_image_sbom_spdx" @@ -268,149 +92,54 @@ POPULATE_SDK_POST_TARGET_COMMAND:append:task-populate-sdk-ext = " sdk_ext_target python sdk_host_create_spdx() { from pathlib import Path + import oe.spdx30_tasks spdx_work_dir = Path(d.getVar('SPDXSDKWORK')) - sdk_create_spdx(d, "host", spdx_work_dir, d.getVar("TOOLCHAIN_OUTPUTNAME")) + oe.spdx30_tasks.sdk_create_spdx(d, "host", spdx_work_dir, d.getVar("TOOLCHAIN_OUTPUTNAME")) } python sdk_target_create_spdx() { from pathlib import Path + import oe.spdx30_tasks spdx_work_dir = Path(d.getVar('SPDXSDKWORK')) - sdk_create_spdx(d, "target", spdx_work_dir, d.getVar("TOOLCHAIN_OUTPUTNAME")) + oe.spdx30_tasks.sdk_create_spdx(d, "target", spdx_work_dir, d.getVar("TOOLCHAIN_OUTPUTNAME")) } python sdk_ext_host_create_spdx() { from pathlib import Path + import oe.spdx30_tasks spdx_work_dir = Path(d.getVar('SPDXSDKEXTWORK')) # TODO: This doesn't seem to work - sdk_create_spdx(d, "host", spdx_work_dir, d.getVar("TOOLCHAINEXT_OUTPUTNAME")) + oe.spdx30_tasks.sdk_create_spdx(d, "host", spdx_work_dir, d.getVar("TOOLCHAINEXT_OUTPUTNAME")) } python sdk_ext_target_create_spdx() { from pathlib import Path + import oe.spdx30_tasks spdx_work_dir = Path(d.getVar('SPDXSDKEXTWORK')) # TODO: This doesn't seem to work - sdk_create_spdx(d, "target", spdx_work_dir, d.getVar("TOOLCHAINEXT_OUTPUTNAME")) + oe.spdx30_tasks.sdk_create_spdx(d, "target", spdx_work_dir, d.getVar("TOOLCHAINEXT_OUTPUTNAME")) } -def sdk_create_spdx(d, sdk_type, 
spdx_work_dir, toolchain_outputname): - from pathlib import Path - from oe.sdk import sdk_list_installed_packages - import oe.spdx30 - import oe.sbom30 - from datetime import datetime - - sdk_name = toolchain_outputname + "-" + sdk_type - sdk_packages = sdk_list_installed_packages(d, sdk_type == "target") - - objset = oe.sbom30.ObjectSet.new_objset(d, sdk_name) - - sdk_rootfs = objset.add_root(oe.spdx30.software_Package( - _id=objset.new_spdxid("sdk-rootfs", sdk_name), - creationInfo=objset.doc.creationInfo, - name=sdk_name, - software_primaryPurpose=oe.spdx30.software_SoftwarePurpose.archive, - )) - set_timestamp_now(d, sdk_rootfs, "builtTime") - - sdk_build = objset.add_root(objset.new_task_build("sdk-rootfs", "sdk-rootfs")) - set_timestamp_now(d, sdk_build, "build_buildEndTime") - - objset.new_scoped_relationship( - [sdk_build], - oe.spdx30.RelationshipType.hasOutputs, - oe.spdx30.LifecycleScopeType.build, - [sdk_rootfs], - ) - - collect_build_package_inputs(d, objset, sdk_build, sdk_packages) - - objset.add_aliases() - oe.sbom30.write_jsonld_doc(d, objset, spdx_work_dir / "sdk-rootfs.spdx.json") python sdk_create_sbom() { from pathlib import Path + import oe.spdx30_tasks sdk_deploydir = Path(d.getVar("SDKDEPLOYDIR")) spdx_work_dir = Path(d.getVar('SPDXSDKWORK')) - create_sdk_sbom(d, sdk_deploydir, spdx_work_dir, d.getVar("TOOLCHAIN_OUTPUTNAME")) + oe.spdx30_tasks.create_sdk_sbom(d, sdk_deploydir, spdx_work_dir, d.getVar("TOOLCHAIN_OUTPUTNAME")) } python sdk_ext_create_sbom() { from pathlib import Path + import oe.spdx30_tasks sdk_deploydir = Path(d.getVar("SDKEXTDEPLOYDIR")) spdx_work_dir = Path(d.getVar('SPDXSDKEXTWORK')) - create_sdk_sbom(d, sdk_deploydir, spdx_work_dir, d.getVar("TOOLCHAINEXT_OUTPUTNAME")) + oe.spdx30_tasks.create_sdk_sbom(d, sdk_deploydir, spdx_work_dir, d.getVar("TOOLCHAINEXT_OUTPUTNAME")) } -def create_sdk_sbom(d, sdk_deploydir, spdx_work_dir, toolchain_outputname): - import oe.spdx30 - import oe.sbom30 - from pathlib import Path - from datetime import datetime - - # Load the document written earlier - rootfs_objset = oe.sbom30.load_jsonld(d, spdx_work_dir / "sdk-rootfs.spdx.json", required=True) - - # Create a new build for the SDK installer - sdk_build = rootfs_objset.new_task_build("sdk-populate", "sdk-populate") - set_timestamp_now(d, sdk_build, "build_buildEndTime") - - rootfs = rootfs_objset.find_root(oe.spdx30.software_Package) - if rootfs is None: - bb.fatal("Unable to find rootfs artifact") - - rootfs_objset.new_scoped_relationship( - [sdk_build], - oe.spdx30.RelationshipType.hasInputs, - oe.spdx30.LifecycleScopeType.build, - [rootfs] - ) - - files = set() - root_files = [] - - # NOTE: os.walk() doesn't return symlinks - for dirpath, dirnames, filenames in os.walk(sdk_deploydir): - for fn in filenames: - fpath = Path(dirpath) / fn - if not fpath.is_file() or fpath.is_symlink(): - continue - - relpath = str(fpath.relative_to(sdk_deploydir)) - - f = rootfs_objset.new_file( - rootfs_objset.new_spdxid("sdk-installer", relpath), - relpath, - fpath, - ) - set_timestamp_now(d, f, "builtTime") - - if fn.endswith(".manifest"): - f.software_primaryPurpose = oe.spdx30.software_SoftwarePurpose.manifest - elif fn.endswith(".testdata.json"): - f.software_primaryPurpose = oe.spdx30.software_SoftwarePurpose.configuration - else: - set_purposes(d, f, "SPDX_SDK_PURPOSE") - root_files.append(f) - - files.add(f) - - if files: - rootfs_objset.new_scoped_relationship( - [sdk_build], - oe.spdx30.RelationshipType.hasOutputs, - oe.spdx30.LifecycleScopeType.build, - files, - ) - 
else: - bb.warn(f"No SDK output files found in {sdk_deploydir}") - - objset, sbom = oe.sbom30.create_sbom(d, toolchain_outputname, sorted(list(files)), [rootfs_objset]) - - oe.sbom30.write_jsonld_doc(d, objset, sdk_deploydir / (toolchain_outputname + ".spdx.json")) - diff --git a/meta/lib/oe/spdx30_tasks.py b/meta/lib/oe/spdx30_tasks.py new file mode 100644 index 00000000000..59fd8750744 --- /dev/null +++ b/meta/lib/oe/spdx30_tasks.py @@ -0,0 +1,1229 @@ +# +# Copyright OpenEmbedded Contributors +# +# SPDX-License-Identifier: GPL-2.0-only +# + +import json +import oe.cve_check +import oe.packagedata +import oe.patch +import oe.sbom30 +import oe.spdx30 +import oe.spdx_common +import oe.sdk +import os + +from contextlib import contextmanager +from datetime import datetime, timezone +from pathlib import Path + + +def set_timestamp_now(d, o, prop): + if d.getVar("SPDX_INCLUDE_TIMESTAMPS") == "1": + setattr(o, prop, datetime.now(timezone.utc)) + else: + # Doing this helps to validated that the property actually exists, and + # also that it is not mandatory + delattr(o, prop) + + +def add_license_expression(d, objset, license_expression): + license_data = d.getVar("SPDX_LICENSE_DATA") + simple_license_text = {} + license_text_map = {} + license_ref_idx = 0 + + def add_license_text(name): + nonlocal objset + nonlocal simple_license_text + + if name in simple_license_text: + return simple_license_text[name] + + lic = objset.find_filter( + oe.spdx30.simplelicensing_SimpleLicensingText, + name=name, + ) + + if lic is not None: + simple_license_text[name] = lic + return lic + + lic = objset.add( + oe.spdx30.simplelicensing_SimpleLicensingText( + _id=objset.new_spdxid("license-text", name), + creationInfo=objset.doc.creationInfo, + name=name, + ) + ) + simple_license_text[name] = lic + + if name == "PD": + lic.simplelicensing_licenseText = "Software released to the public domain" + return lic + + # Seach for the license in COMMON_LICENSE_DIR and LICENSE_PATH + for directory in [d.getVar("COMMON_LICENSE_DIR")] + ( + d.getVar("LICENSE_PATH") or "" + ).split(): + try: + with (Path(directory) / name).open(errors="replace") as f: + lic.simplelicensing_licenseText = f.read() + return lic + + except FileNotFoundError: + pass + + # If it's not SPDX or PD, then NO_GENERIC_LICENSE must be set + filename = d.getVarFlag("NO_GENERIC_LICENSE", name) + if filename: + filename = d.expand("${S}/" + filename) + with open(filename, errors="replace") as f: + lic.simplelicensing_licenseText = f.read() + return lic + else: + bb.fatal("Cannot find any text for license %s" % name) + + def convert(l): + nonlocal license_text_map + nonlocal license_ref_idx + + if l == "(" or l == ")": + return l + + if l == "&": + return "AND" + + if l == "|": + return "OR" + + if l == "CLOSED": + return "NONE" + + spdx_license = d.getVarFlag("SPDXLICENSEMAP", l) or l + if spdx_license in license_data["licenses"]: + return spdx_license + + spdx_license = "LicenseRef-" + l + if spdx_license not in license_text_map: + license_text_map[spdx_license] = add_license_text(l)._id + + return spdx_license + + lic_split = ( + license_expression.replace("(", " ( ") + .replace(")", " ) ") + .replace("|", " | ") + .replace("&", " & ") + .split() + ) + spdx_license_expression = " ".join(convert(l) for l in lic_split) + + return objset.new_license_expression(spdx_license_expression, license_text_map) + + +def add_package_files( + d, + objset, + topdir, + get_spdxid, + get_purposes, + *, + archive=None, + ignore_dirs=[], + ignore_top_level_dirs=[], +): + 
source_date_epoch = d.getVar("SOURCE_DATE_EPOCH") + if source_date_epoch: + source_date_epoch = int(source_date_epoch) + + spdx_files = set() + + file_counter = 1 + for subdir, dirs, files in os.walk(topdir): + dirs[:] = [d for d in dirs if d not in ignore_dirs] + if subdir == str(topdir): + dirs[:] = [d for d in dirs if d not in ignore_top_level_dirs] + + for file in files: + filepath = Path(subdir) / file + if filepath.is_symlink() or not filepath.is_file(): + continue + + bb.debug(1, "Adding file %s to %s" % (filepath, objset.doc._id)) + + filename = str(filepath.relative_to(topdir)) + file_purposes = get_purposes(filepath) + + spdx_file = objset.new_file( + get_spdxid(file_counter), + filename, + filepath, + purposes=file_purposes, + ) + spdx_files.add(spdx_file) + + if oe.spdx30.software_SoftwarePurpose.source in file_purposes: + objset.scan_declared_licenses(spdx_file, filepath) + + if archive is not None: + with filepath.open("rb") as f: + info = archive.gettarinfo(fileobj=f) + info.name = filename + info.uid = 0 + info.gid = 0 + info.uname = "root" + info.gname = "root" + + if source_date_epoch is not None and info.mtime > source_date_epoch: + info.mtime = source_date_epoch + + archive.addfile(info, f) + + file_counter += 1 + + return spdx_files + + +def get_package_sources_from_debug( + d, package, package_files, sources, source_hash_cache +): + def file_path_match(file_path, pkg_file): + if file_path.lstrip("/") == pkg_file.name.lstrip("/"): + return True + + for e in pkg_file.extension: + if isinstance(e, oe.sbom30.OEFileNameAliasExtension): + for a in e.aliases: + if file_path.lstrip("/") == a.lstrip("/"): + return True + + return False + + debug_search_paths = [ + Path(d.getVar("PKGD")), + Path(d.getVar("STAGING_DIR_TARGET")), + Path(d.getVar("STAGING_DIR_NATIVE")), + Path(d.getVar("STAGING_KERNEL_DIR")), + ] + + pkg_data = oe.packagedata.read_subpkgdata_extended(package, d) + + if pkg_data is None: + return + + dep_source_files = set() + + for file_path, file_data in pkg_data["files_info"].items(): + if not "debugsrc" in file_data: + continue + + if not any(file_path_match(file_path, pkg_file) for pkg_file in package_files): + bb.fatal( + "No package file found for %s in %s; SPDX found: %s" + % (str(file_path), package, " ".join(p.name for p in package_files)) + ) + continue + + for debugsrc in file_data["debugsrc"]: + for search in debug_search_paths: + if debugsrc.startswith("/usr/src/kernel"): + debugsrc_path = search / debugsrc.replace("/usr/src/kernel/", "") + else: + debugsrc_path = search / debugsrc.lstrip("/") + + if debugsrc_path in source_hash_cache: + file_sha256 = source_hash_cache[debugsrc_path] + if file_sha256 is None: + continue + else: + if not debugsrc_path.exists(): + source_hash_cache[debugsrc_path] = None + continue + + file_sha256 = bb.utils.sha256_file(debugsrc_path) + source_hash_cache[debugsrc_path] = file_sha256 + + if file_sha256 in sources: + source_file = sources[file_sha256] + dep_source_files.add(source_file) + else: + bb.debug( + 1, + "Debug source %s with SHA256 %s not found in any dependency" + % (str(debugsrc_path), file_sha256), + ) + break + else: + bb.debug(1, "Debug source %s not found" % debugsrc) + + return dep_source_files + + +def collect_dep_objsets(d, build): + deps = oe.spdx_common.get_spdx_deps(d) + + dep_objsets = [] + dep_builds = set() + + dep_build_spdxids = set() + for dep in deps: + bb.debug(1, "Fetching SPDX for dependency %s" % (dep.pn)) + dep_build, dep_objset = oe.sbom30.find_root_obj_in_jsonld( + d, "recipes", dep.pn, 
oe.spdx30.build_Build + ) + # If the dependency is part of the taskhash, return it to be linked + # against. Otherwise, it cannot be linked against because this recipe + # will not be rebuilt if the dependency changes + if dep.in_taskhash: + dep_objsets.append(dep_objset) + + # The build _can_ be linked against (by alias) + dep_builds.add(dep_build) + + return dep_objsets, dep_builds + + +def collect_dep_sources(dep_objsets): + sources = {} + for objset in dep_objsets: + # Don't collect sources from native recipes as they + # match non-native sources also. + if objset.is_native(): + continue + + bb.debug(1, "Fetching Sources for dependency %s" % (objset.doc.name)) + + dep_build = objset.find_root(oe.spdx30.build_Build) + if not dep_build: + bb.fatal("Unable to find a build") + + for e in objset.foreach_type(oe.spdx30.Relationship): + if dep_build is not e.from_: + continue + + if e.relationshipType != oe.spdx30.RelationshipType.hasInputs: + continue + + for to in e.to: + if not isinstance(to, oe.spdx30.software_File): + continue + + if ( + to.software_primaryPurpose + != oe.spdx30.software_SoftwarePurpose.source + ): + continue + + for v in to.verifiedUsing: + if v.algorithm == oe.spdx30.HashAlgorithm.sha256: + sources[v.hashValue] = to + break + else: + bb.fatal( + "No SHA256 found for %s in %s" % (to.name, objset.doc.name) + ) + + return sources + + +def add_download_files(d, objset): + inputs = set() + + urls = d.getVar("SRC_URI").split() + fetch = bb.fetch2.Fetch(urls, d) + + for download_idx, src_uri in enumerate(urls): + fd = fetch.ud[src_uri] + + for name in fd.names: + file_name = os.path.basename(fetch.localpath(src_uri)) + if oe.patch.patch_path(src_uri, fetch, "", expand=False): + primary_purpose = oe.spdx30.software_SoftwarePurpose.patch + else: + primary_purpose = oe.spdx30.software_SoftwarePurpose.source + + if fd.type == "file": + if os.path.isdir(fd.localpath): + walk_idx = 1 + for root, dirs, files in os.walk(fd.localpath): + for f in files: + f_path = os.path.join(root, f) + if os.path.islink(f_path): + # TODO: SPDX doesn't support symlinks yet + continue + + file = objset.new_file( + objset.new_spdxid( + "source", str(download_idx + 1), str(walk_idx) + ), + os.path.join( + file_name, os.path.relpath(f_path, fd.localpath) + ), + f_path, + purposes=[primary_purpose], + ) + + inputs.add(file) + walk_idx += 1 + + else: + file = objset.new_file( + objset.new_spdxid("source", str(download_idx + 1)), + file_name, + fd.localpath, + purposes=[primary_purpose], + ) + inputs.add(file) + + else: + uri = fd.type + proto = getattr(fd, "proto", None) + if proto is not None: + uri = uri + "+" + proto + uri = uri + "://" + fd.host + fd.path + + if fd.method.supports_srcrev(): + uri = uri + "@" + fd.revisions[name] + + dl = objset.add( + oe.spdx30.software_Package( + _id=objset.new_spdxid("source", str(download_idx + 1)), + creationInfo=objset.doc.creationInfo, + name=file_name, + software_primaryPurpose=primary_purpose, + software_downloadLocation=uri, + ) + ) + + if fd.method.supports_checksum(fd): + # TODO Need something better than hard coding this + for checksum_id in ["sha256", "sha1"]: + expected_checksum = getattr( + fd, "%s_expected" % checksum_id, None + ) + if expected_checksum is None: + continue + + dl.verifiedUsing.append( + oe.spdx30.Hash( + algorithm=getattr(oe.spdx30.HashAlgorithm, checksum_id), + hashValue=expected_checksum, + ) + ) + + inputs.add(dl) + + return inputs + + +def set_purposes(d, element, *var_names, force_purposes=[]): + purposes = force_purposes[:] + + for
var_name in var_names: + val = d.getVar(var_name) + if val: + purposes.extend(val.split()) + break + + if not purposes: + bb.warn("No SPDX purposes found in %s" % " ".join(var_names)) + return + + element.software_primaryPurpose = getattr( + oe.spdx30.software_SoftwarePurpose, purposes[0] + ) + element.software_additionalPurpose = [ + getattr(oe.spdx30.software_SoftwarePurpose, p) for p in purposes[1:] + ] + + +def create_spdx(d): + def set_var_field(var, obj, name, package=None): + val = None + if package: + val = d.getVar("%s:%s" % (var, package)) + + if not val: + val = d.getVar(var) + + if val: + setattr(obj, name, val) + + deploydir = Path(d.getVar("SPDXDEPLOY")) + deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX")) + spdx_workdir = Path(d.getVar("SPDXWORK")) + include_sources = d.getVar("SPDX_INCLUDE_SOURCES") == "1" + pkg_arch = d.getVar("SSTATE_PKGARCH") + is_native = bb.data.inherits_class("native", d) or bb.data.inherits_class( + "cross", d + ) + include_vex = d.getVar("SPDX_INCLUDE_VEX") + if not include_vex in ("none", "current", "all"): + bb.fatal("SPDX_INCLUDE_VEX must be one of 'none', 'current', 'all'") + + build_objset = oe.sbom30.ObjectSet.new_objset(d, d.getVar("PN")) + + build = build_objset.new_task_build("recipe", "recipe") + build_objset.doc.rootElement.append(build) + + build_objset.set_is_native(is_native) + + for var in (d.getVar("SPDX_CUSTOM_ANNOTATION_VARS") or "").split(): + new_annotation( + d, + build_objset, + build, + "%s=%s" % (var, d.getVar(var)), + oe.spdx30.AnnotationType.other, + ) + + build_inputs = set() + + # Add CVEs + cve_by_status = {} + if include_vex != "none": + for cve in d.getVarFlags("CVE_STATUS") or {}: + status, detail, description = oe.cve_check.decode_cve_status(d, cve) + + # If this CVE is fixed upstream, skip it unless all CVEs are + # specified. + if include_vex != "all" and detail in ( + "fixed-version", + "cpe-stable-backport", + ): + bb.debug(1, "Skipping %s since it is already fixed upstream" % cve) + continue + + cve_by_status.setdefault(status, {})[cve] = ( + build_objset.new_cve_vuln(cve), + detail, + description, + ) + + cpe_ids = oe.cve_check.get_cpe_ids(d.getVar("CVE_PRODUCT"), d.getVar("CVE_VERSION")) + + source_files = add_download_files(d, build_objset) + build_inputs |= source_files + + recipe_spdx_license = add_license_expression(d, build_objset, d.getVar("LICENSE")) + build_objset.new_relationship( + source_files, + oe.spdx30.RelationshipType.hasConcludedLicense, + [recipe_spdx_license], + ) + + if oe.spdx_common.process_sources(d) and include_sources: + bb.debug(1, "Adding source files to SPDX") + oe.spdx_common.get_patched_src(d) + + build_inputs |= add_package_files( + d, + build_objset, + spdx_workdir, + lambda file_counter: build_objset.new_spdxid( + "sourcefile", str(file_counter) + ), + lambda filepath: [oe.spdx30.software_SoftwarePurpose.source], + ignore_dirs=[".git"], + ignore_top_level_dirs=["temp"], + archive=None, + ) + + dep_objsets, dep_builds = collect_dep_objsets(d, build) + if dep_builds: + build_objset.new_scoped_relationship( + [build], + oe.spdx30.RelationshipType.dependsOn, + oe.spdx30.LifecycleScopeType.build, + sorted(oe.sbom30.get_element_link_id(b) for b in dep_builds), + ) + + debug_source_ids = set() + source_hash_cache = {} + + # Write out the package SPDX data now. 
It is not complete as we cannot + # write the runtime data, so write it to a staging area and a later task + # will write out the final collection + + # TODO: Handle native recipe output + if not is_native: + bb.debug(1, "Collecting Dependency sources files") + sources = collect_dep_sources(dep_objsets) + + bb.build.exec_func("read_subpackage_metadata", d) + + pkgdest = Path(d.getVar("PKGDEST")) + for package in d.getVar("PACKAGES").split(): + if not oe.packagedata.packaged(package, d): + continue + + pkg_name = d.getVar("PKG:%s" % package) or package + + bb.debug(1, "Creating SPDX for package %s" % pkg_name) + + pkg_objset = oe.sbom30.ObjectSet.new_objset(d, pkg_name) + + spdx_package = pkg_objset.add_root( + oe.spdx30.software_Package( + _id=pkg_objset.new_spdxid("package", pkg_name), + creationInfo=pkg_objset.doc.creationInfo, + name=pkg_name, + software_packageVersion=d.getVar("PV"), + ) + ) + set_timestamp_now(d, spdx_package, "builtTime") + + set_purposes( + d, + spdx_package, + "SPDX_PACKAGE_ADDITIONAL_PURPOSE:%s" % package, + "SPDX_PACKAGE_ADDITIONAL_PURPOSE", + force_purposes=["install"], + ) + + supplier = build_objset.new_agent("SPDX_PACKAGE_SUPPLIER") + if supplier is not None: + spdx_package.supplier = ( + supplier if isinstance(supplier, str) else supplier._id + ) + + set_var_field( + "HOMEPAGE", spdx_package, "software_homePage", package=package + ) + set_var_field("SUMMARY", spdx_package, "summary", package=package) + set_var_field("DESCRIPTION", spdx_package, "description", package=package) + + pkg_objset.new_scoped_relationship( + [build._id], + oe.spdx30.RelationshipType.hasOutputs, + oe.spdx30.LifecycleScopeType.build, + [spdx_package], + ) + + for cpe_id in cpe_ids: + spdx_package.externalIdentifier.append( + oe.spdx30.ExternalIdentifier( + externalIdentifierType=oe.spdx30.ExternalIdentifierType.cpe23, + identifier=cpe_id, + ) + ) + + # TODO: Generate a file for each actual IPK/DEB/RPM/TGZ file + # generated and link it to the package + # spdx_package_file = pkg_objset.add(oe.spdx30.software_File( + # _id=pkg_objset.new_spdxid("distribution", pkg_name), + # creationInfo=pkg_objset.doc.creationInfo, + # name=pkg_name, + # software_primaryPurpose=spdx_package.software_primaryPurpose, + # software_additionalPurpose=spdx_package.software_additionalPurpose, + # )) + # set_timestamp_now(d, spdx_package_file, "builtTime") + + ## TODO add hashes + # pkg_objset.new_relationship( + # [spdx_package], + # oe.spdx30.RelationshipType.hasDistributionArtifact, + # [spdx_package_file], + # ) + + # NOTE: licenses live in the recipe collection and are referenced + # by ID in the package collection(s). 
This helps reduce duplication + # (since a lot of packages will have the same license), and also + # prevents duplicate license SPDX IDs in the packages + package_license = d.getVar("LICENSE:%s" % package) + if package_license and package_license != d.getVar("LICENSE"): + package_spdx_license = add_license_expression( + d, build_objset, package_license + ) + else: + package_spdx_license = recipe_spdx_license + + pkg_objset.new_relationship( + [spdx_package], + oe.spdx30.RelationshipType.hasConcludedLicense, + [package_spdx_license._id], + ) + + # NOTE: CVE Elements live in the recipe collection + all_cves = set() + for status, cves in cve_by_status.items(): + for cve, items in cves.items(): + spdx_cve, detail, description = items + + all_cves.add(spdx_cve._id) + + if status == "Patched": + pkg_objset.new_vex_patched_relationship( + [spdx_cve._id], [spdx_package] + ) + elif status == "Unpatched": + pkg_objset.new_vex_unpatched_relationship( + [spdx_cve._id], [spdx_package] + ) + elif status == "Ignored": + spdx_vex = pkg_objset.new_vex_ignored_relationship( + [spdx_cve._id], + [spdx_package], + impact_statement=description, + ) + + if detail in ( + "ignored", + "cpe-incorrect", + "disputed", + "upstream-wontfix", + ): + # VEX doesn't have justifications for this + pass + elif detail in ( + "not-applicable-config", + "not-applicable-platform", + ): + for v in spdx_vex: + v.security_justificationType = ( + oe.spdx30.security_VexJustificationType.vulnerableCodeNotPresent + ) + else: + bb.fatal(f"Unknown detail '{detail}' for ignored {cve}") + else: + bb.fatal(f"Unknown CVE status {status}") + + if all_cves: + pkg_objset.new_relationship( + [spdx_package], + oe.spdx30.RelationshipType.hasAssociatedVulnerability, + sorted(list(all_cves)), + ) + + bb.debug(1, "Adding package files to SPDX for package %s" % pkg_name) + package_files = add_package_files( + d, + pkg_objset, + pkgdest / package, + lambda file_counter: pkg_objset.new_spdxid( + "package", pkg_name, "file", str(file_counter) + ), + # TODO: Can we know the purpose here? 
+ lambda filepath: [], + ignore_top_level_dirs=["CONTROL", "DEBIAN"], + archive=None, + ) + + if package_files: + pkg_objset.new_relationship( + [spdx_package], + oe.spdx30.RelationshipType.contains, + sorted(list(package_files)), + ) + + if include_sources: + debug_sources = get_package_sources_from_debug( + d, package, package_files, sources, source_hash_cache + ) + debug_source_ids |= set( + oe.sbom30.get_element_link_id(d) for d in debug_sources + ) + + oe.sbom30.write_recipe_jsonld_doc( + d, pkg_objset, "packages-staging", deploydir, create_spdx_id_links=False + ) + + if include_sources: + bb.debug(1, "Adding sysroot files to SPDX") + sysroot_files = add_package_files( + d, + build_objset, + d.expand("${COMPONENTS_DIR}/${PACKAGE_ARCH}/${PN}"), + lambda file_counter: build_objset.new_spdxid("sysroot", str(file_counter)), + lambda filepath: [], + archive=None, + ) + + if sysroot_files: + build_objset.new_scoped_relationship( + [build], + oe.spdx30.RelationshipType.hasOutputs, + oe.spdx30.LifecycleScopeType.build, + sorted(list(sysroot_files)), + ) + + if build_inputs or debug_source_ids: + build_objset.new_scoped_relationship( + [build], + oe.spdx30.RelationshipType.hasInputs, + oe.spdx30.LifecycleScopeType.build, + sorted(list(build_inputs)) + sorted(list(debug_source_ids)), + ) + + oe.sbom30.write_recipe_jsonld_doc(d, build_objset, "recipes", deploydir) + + +def create_package_spdx(d): + deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX")) + deploydir = Path(d.getVar("SPDXRUNTIMEDEPLOY")) + is_native = bb.data.inherits_class("native", d) or bb.data.inherits_class( + "cross", d + ) + + providers = oe.spdx_common.collect_package_providers(d) + pkg_arch = d.getVar("SSTATE_PKGARCH") + + if is_native: + return + + bb.build.exec_func("read_subpackage_metadata", d) + + dep_package_cache = {} + + # Any element common to all packages that need to be referenced by ID + # should be written into this objset set + common_objset = oe.sbom30.ObjectSet.new_objset( + d, "%s-package-common" % d.getVar("PN") + ) + + pkgdest = Path(d.getVar("PKGDEST")) + for package in d.getVar("PACKAGES").split(): + localdata = bb.data.createCopy(d) + pkg_name = d.getVar("PKG:%s" % package) or package + localdata.setVar("PKG", pkg_name) + localdata.setVar("OVERRIDES", d.getVar("OVERRIDES", False) + ":" + package) + + if not oe.packagedata.packaged(package, localdata): + continue + + spdx_package, pkg_objset = oe.sbom30.load_obj_in_jsonld( + d, + pkg_arch, + "packages-staging", + pkg_name, + oe.spdx30.software_Package, + software_primaryPurpose=oe.spdx30.software_SoftwarePurpose.install, + ) + + # We will write out a new collection, so link it to the new + # creation info in the common package data. 
The old creation info + # should still exist and be referenced by all the existing elements + # in the package + pkg_objset.creationInfo = pkg_objset.copy_creation_info( + common_objset.doc.creationInfo + ) + + runtime_spdx_deps = set() + + deps = bb.utils.explode_dep_versions2(localdata.getVar("RDEPENDS") or "") + seen_deps = set() + for dep, _ in deps.items(): + if dep in seen_deps: + continue + + if dep not in providers: + continue + + (dep, _) = providers[dep] + + if not oe.packagedata.packaged(dep, localdata): + continue + + dep_pkg_data = oe.packagedata.read_subpkgdata_dict(dep, d) + dep_pkg = dep_pkg_data["PKG"] + + if dep in dep_package_cache: + dep_spdx_package = dep_package_cache[dep] + else: + bb.debug(1, "Searching for %s" % dep_pkg) + dep_spdx_package, _ = oe.sbom30.find_root_obj_in_jsonld( + d, + "packages-staging", + dep_pkg, + oe.spdx30.software_Package, + software_primaryPurpose=oe.spdx30.software_SoftwarePurpose.install, + ) + dep_package_cache[dep] = dep_spdx_package + + runtime_spdx_deps.add(dep_spdx_package) + seen_deps.add(dep) + + if runtime_spdx_deps: + pkg_objset.new_scoped_relationship( + [spdx_package], + oe.spdx30.RelationshipType.dependsOn, + oe.spdx30.LifecycleScopeType.runtime, + [oe.sbom30.get_element_link_id(dep) for dep in runtime_spdx_deps], + ) + + oe.sbom30.write_recipe_jsonld_doc(d, pkg_objset, "packages", deploydir) + + oe.sbom30.write_recipe_jsonld_doc(d, common_objset, "common-package", deploydir) + + +def write_bitbake_spdx(d): + # Set PN to "bitbake" so that SPDX IDs can be generated + d.setVar("PN", "bitbake") + d.setVar("BB_TASKHASH", "bitbake") + oe.spdx_common.load_spdx_license_data(d) + + deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX")) + + objset = oe.sbom30.ObjectSet.new_objset(d, "bitbake", False) + + host_import_key = d.getVar("SPDX_BUILD_HOST") + invoked_by = objset.new_agent("SPDX_INVOKED_BY", add=False) + on_behalf_of = objset.new_agent("SPDX_ON_BEHALF_OF", add=False) + + if d.getVar("SPDX_INCLUDE_BITBAKE_PARENT_BUILD") == "1": + # Since the Build objects are unique, we may as well set the creation + # time to the current time instead of the fallback SDE + objset.doc.creationInfo.created = datetime.now(timezone.utc) + + # Each invocation of bitbake should have a unique ID since it is a + # unique build + nonce = os.urandom(16).hex() + + build = objset.add_root( + oe.spdx30.build_Build( + _id=objset.new_spdxid(nonce, include_unihash=False), + creationInfo=objset.doc.creationInfo, + build_buildType=oe.sbom30.SPDX_BUILD_TYPE, + ) + ) + set_timestamp_now(d, build, "build_buildStartTime") + + if host_import_key: + objset.new_scoped_relationship( + [build], + oe.spdx30.RelationshipType.hasHost, + oe.spdx30.LifecycleScopeType.build, + [objset.new_import("SPDX_BUILD_HOST")], + ) + + if invoked_by: + objset.add(invoked_by) + invoked_by_spdx = objset.new_scoped_relationship( + [build], + oe.spdx30.RelationshipType.invokedBy, + oe.spdx30.LifecycleScopeType.build, + [invoked_by], + ) + + if on_behalf_of: + objset.add(on_behalf_of) + objset.new_scoped_relationship( + [on_behalf_of], + oe.spdx30.RelationshipType.delegatedTo, + oe.spdx30.LifecycleScopeType.build, + invoked_by_spdx, + ) + + elif on_behalf_of: + bb.warn("SPDX_ON_BEHALF_OF has no effect if SPDX_INVOKED_BY is not set") + + else: + if host_import_key: + bb.warn( + "SPDX_BUILD_HOST has no effect if SPDX_INCLUDE_BITBAKE_PARENT_BUILD is not set" + ) + + if invoked_by: + bb.warn( + "SPDX_INVOKED_BY has no effect if SPDX_INCLUDE_BITBAKE_PARENT_BUILD is not set" + ) + + if on_behalf_of: + 
bb.warn( + "SPDX_ON_BEHALF_OF has no effect if SPDX_INCLUDE_BITBAKE_PARENT_BUILD is not set" + ) + + for obj in objset.foreach_type(oe.spdx30.Element): + obj.extension.append(oe.sbom30.OELinkExtension(link_spdx_id=False)) + obj.extension.append(oe.sbom30.OEIdAliasExtension()) + + oe.sbom30.write_jsonld_doc(d, objset, deploy_dir_spdx / "bitbake.spdx.json") + + +def collect_build_package_inputs(d, objset, build, packages): + providers = oe.spdx_common.collect_package_providers(d) + + build_deps = set() + + for name in sorted(packages.keys()): + if name not in providers: + bb.fatal("Unable to find SPDX provider for '%s'" % name) + + pkg_name, pkg_hashfn = providers[name] + + # Copy all of the package SPDX files into the Sbom elements + pkg_spdx, _ = oe.sbom30.find_root_obj_in_jsonld( + d, + "packages", + pkg_name, + oe.spdx30.software_Package, + software_primaryPurpose=oe.spdx30.software_SoftwarePurpose.install, + ) + build_deps.add(pkg_spdx._id) + + if build_deps: + objset.new_scoped_relationship( + [build], + oe.spdx30.RelationshipType.hasInputs, + oe.spdx30.LifecycleScopeType.build, + sorted(list(build_deps)), + ) + + +def create_rootfs_spdx(d): + deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX")) + deploydir = Path(d.getVar("SPDXROOTFSDEPLOY")) + root_packages_file = Path(d.getVar("SPDX_ROOTFS_PACKAGES")) + image_basename = d.getVar("IMAGE_BASENAME") + machine = d.getVar("MACHINE") + + with root_packages_file.open("r") as f: + packages = json.load(f) + + objset = oe.sbom30.ObjectSet.new_objset(d, "%s-%s" % (image_basename, machine)) + + rootfs = objset.add_root( + oe.spdx30.software_Package( + _id=objset.new_spdxid("rootfs", image_basename), + creationInfo=objset.doc.creationInfo, + name=image_basename, + software_primaryPurpose=oe.spdx30.software_SoftwarePurpose.archive, + ) + ) + set_timestamp_now(d, rootfs, "builtTime") + + rootfs_build = objset.add_root(objset.new_task_build("rootfs", "rootfs")) + set_timestamp_now(d, rootfs_build, "build_buildEndTime") + + objset.new_scoped_relationship( + [rootfs_build], + oe.spdx30.RelationshipType.hasOutputs, + oe.spdx30.LifecycleScopeType.build, + [rootfs], + ) + + collect_build_package_inputs(d, objset, rootfs_build, packages) + + oe.sbom30.write_recipe_jsonld_doc(d, objset, "rootfs", deploydir) + + +def create_image_spdx(d): + image_deploy_dir = Path(d.getVar("IMGDEPLOYDIR")) + manifest_path = Path(d.getVar("IMAGE_OUTPUT_MANIFEST")) + spdx_work_dir = Path(d.getVar("SPDXIMAGEWORK")) + + image_basename = d.getVar("IMAGE_BASENAME") + machine = d.getVar("MACHINE") + + objset = oe.sbom30.ObjectSet.new_objset(d, "%s-%s" % (image_basename, machine)) + + with manifest_path.open("r") as f: + manifest = json.load(f) + + builds = [] + for task in manifest: + imagetype = task["imagetype"] + taskname = task["taskname"] + + image_build = objset.add_root( + objset.new_task_build(taskname, "image/%s" % imagetype) + ) + set_timestamp_now(d, image_build, "build_buildEndTime") + builds.append(image_build) + + artifacts = [] + + for image in task["images"]: + image_filename = image["filename"] + image_path = image_deploy_dir / image_filename + a = objset.add_root( + oe.spdx30.software_File( + _id=objset.new_spdxid("image", image_filename), + creationInfo=objset.doc.creationInfo, + name=image_filename, + verifiedUsing=[ + oe.spdx30.Hash( + algorithm=oe.spdx30.HashAlgorithm.sha256, + hashValue=bb.utils.sha256_file(image_path), + ) + ], + ) + ) + set_purposes( + d, a, "SPDX_IMAGE_PURPOSE:%s" % imagetype, "SPDX_IMAGE_PURPOSE" + ) + set_timestamp_now(d, a, 
"builtTime") + + artifacts.append(a) + + if artifacts: + objset.new_scoped_relationship( + [image_build], + oe.spdx30.RelationshipType.hasOutputs, + oe.spdx30.LifecycleScopeType.build, + artifacts, + ) + + if builds: + rootfs_image, _ = oe.sbom30.find_root_obj_in_jsonld( + d, + "rootfs", + "%s-%s" % (image_basename, machine), + oe.spdx30.software_Package, + # TODO: Should use a purpose to filter here? + ) + objset.new_scoped_relationship( + builds, + oe.spdx30.RelationshipType.hasInputs, + oe.spdx30.LifecycleScopeType.build, + [rootfs_image._id], + ) + + objset.add_aliases() + objset.link() + oe.sbom30.write_recipe_jsonld_doc(d, objset, "image", spdx_work_dir) + + +def create_image_sbom_spdx(d): + image_name = d.getVar("IMAGE_NAME") + image_basename = d.getVar("IMAGE_BASENAME") + image_link_name = d.getVar("IMAGE_LINK_NAME") + imgdeploydir = Path(d.getVar("SPDXIMAGEDEPLOYDIR")) + machine = d.getVar("MACHINE") + + spdx_path = imgdeploydir / (image_name + ".spdx.json") + + root_elements = [] + + # TODO: Do we need to add the rootfs or are the image files sufficient? + rootfs_image, _ = oe.sbom30.find_root_obj_in_jsonld( + d, + "rootfs", + "%s-%s" % (image_basename, machine), + oe.spdx30.software_Package, + # TODO: Should use a purpose here? + ) + root_elements.append(rootfs_image._id) + + image_objset, _ = oe.sbom30.find_jsonld( + d, "image", "%s-%s" % (image_basename, machine), required=True + ) + for o in image_objset.foreach_root(oe.spdx30.software_File): + root_elements.append(o._id) + + objset, sbom = oe.sbom30.create_sbom(d, image_name, root_elements) + + oe.sbom30.write_jsonld_doc(d, objset, spdx_path) + + def make_image_link(target_path, suffix): + if image_link_name: + link = imgdeploydir / (image_link_name + suffix) + if link != target_path: + link.symlink_to(os.path.relpath(target_path, link.parent)) + + make_image_link(spdx_path, ".spdx.json") + + +def sdk_create_spdx(d, sdk_type, spdx_work_dir, toolchain_outputname): + sdk_name = toolchain_outputname + "-" + sdk_type + sdk_packages = oe.sdk.sdk_list_installed_packages(d, sdk_type == "target") + + objset = oe.sbom30.ObjectSet.new_objset(d, sdk_name) + + sdk_rootfs = objset.add_root( + oe.spdx30.software_Package( + _id=objset.new_spdxid("sdk-rootfs", sdk_name), + creationInfo=objset.doc.creationInfo, + name=sdk_name, + software_primaryPurpose=oe.spdx30.software_SoftwarePurpose.archive, + ) + ) + set_timestamp_now(d, sdk_rootfs, "builtTime") + + sdk_build = objset.add_root(objset.new_task_build("sdk-rootfs", "sdk-rootfs")) + set_timestamp_now(d, sdk_build, "build_buildEndTime") + + objset.new_scoped_relationship( + [sdk_build], + oe.spdx30.RelationshipType.hasOutputs, + oe.spdx30.LifecycleScopeType.build, + [sdk_rootfs], + ) + + collect_build_package_inputs(d, objset, sdk_build, sdk_packages) + + objset.add_aliases() + oe.sbom30.write_jsonld_doc(d, objset, spdx_work_dir / "sdk-rootfs.spdx.json") + + +def create_sdk_sbom(d, sdk_deploydir, spdx_work_dir, toolchain_outputname): + # Load the document written earlier + rootfs_objset = oe.sbom30.load_jsonld( + d, spdx_work_dir / "sdk-rootfs.spdx.json", required=True + ) + + # Create a new build for the SDK installer + sdk_build = rootfs_objset.new_task_build("sdk-populate", "sdk-populate") + set_timestamp_now(d, sdk_build, "build_buildEndTime") + + rootfs = rootfs_objset.find_root(oe.spdx30.software_Package) + if rootfs is None: + bb.fatal("Unable to find rootfs artifact") + + rootfs_objset.new_scoped_relationship( + [sdk_build], + oe.spdx30.RelationshipType.hasInputs, + 
oe.spdx30.LifecycleScopeType.build, + [rootfs], + ) + + files = set() + root_files = [] + + # NOTE: os.walk() doesn't return symlinks + for dirpath, dirnames, filenames in os.walk(sdk_deploydir): + for fn in filenames: + fpath = Path(dirpath) / fn + if not fpath.is_file() or fpath.is_symlink(): + continue + + relpath = str(fpath.relative_to(sdk_deploydir)) + + f = rootfs_objset.new_file( + rootfs_objset.new_spdxid("sdk-installer", relpath), + relpath, + fpath, + ) + set_timestamp_now(d, f, "builtTime") + + if fn.endswith(".manifest"): + f.software_primaryPurpose = oe.spdx30.software_SoftwarePurpose.manifest + elif fn.endswith(".testdata.json"): + f.software_primaryPurpose = ( + oe.spdx30.software_SoftwarePurpose.configuration + ) + else: + set_purposes(d, f, "SPDX_SDK_PURPOSE") + root_files.append(f) + + files.add(f) + + if files: + rootfs_objset.new_scoped_relationship( + [sdk_build], + oe.spdx30.RelationshipType.hasOutputs, + oe.spdx30.LifecycleScopeType.build, + files, + ) + else: + bb.warn(f"No SDK output files found in {sdk_deploydir}") + + objset, sbom = oe.sbom30.create_sbom( + d, toolchain_outputname, sorted(list(files)), [rootfs_objset] + ) + + oe.sbom30.write_jsonld_doc( + d, objset, sdk_deploydir / (toolchain_outputname + ".spdx.json") + ) From patchwork Fri Jul 12 15:58:21 2024 Content-Type: text/plain; charset="utf-8" MIME-Version: 1.0 Content-Transfer-Encoding: 7bit X-Patchwork-Submitter: Joshua Watt X-Patchwork-Id: 46269 Return-Path: X-Spam-Checker-Version: SpamAssassin 3.4.0 (2014-02-07) on aws-us-west-2-korg-lkml-1.web.codeaurora.org Received: from aws-us-west-2-korg-lkml-1.web.codeaurora.org (localhost.localdomain [127.0.0.1]) by smtp.lore.kernel.org (Postfix) with ESMTP id 2A7D8C41513 for ; Fri, 12 Jul 2024 16:03:33 +0000 (UTC) Received: from mail-oa1-f50.google.com (mail-oa1-f50.google.com [209.85.160.50]) by mx.groups.io with SMTP id smtpd.web11.11597.1720800207761906880 for ; Fri, 12 Jul 2024 09:03:27 -0700 Authentication-Results: mx.groups.io; dkim=pass header.i=@gmail.com header.s=20230601 header.b=cAceWUEd; spf=pass (domain: gmail.com, ip: 209.85.160.50, mailfrom: jpewhacker@gmail.com) Received: by mail-oa1-f50.google.com with SMTP id 586e51a60fabf-25d6dd59170so1149966fac.0 for ; Fri, 12 Jul 2024 09:03:27 -0700 (PDT) DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=gmail.com; s=20230601; t=1720800206; x=1721405006; darn=lists.openembedded.org; h=content-transfer-encoding:mime-version:references:in-reply-to :message-id:date:subject:cc:to:from:from:to:cc:subject:date :message-id:reply-to; bh=YLifC1TxjruMonFT8U23tJUUqGaBbLMn1FCOnDYPeY8=; b=cAceWUEdJUUKyq8rfDLPuBIPi+xRY5WkKb/IrgpZY5q+34lGZLLxZ+RKAq7NudTFOx x8mprjw49XpEJZR2YxcUpSv6o3naXE2Q9Mao9YSsjk+RdIsEAcRjNwIKCFsKOMf5/Hqf qUYsewZZERbYdGAvm0OKLLqK0cUNGUeQ8UfwIeGfHAnJqyb96H3U17UhoGmXKXjcY5b0 zQG+JywhPRRZAoE2EwY81H39htDNa88W9YOb0lJx1EN588CBoKLYfMGnwR+sE6CqjJAQ /FwKb8PP/4Mp0qChMTg3GWVFmMZmYsOVtcFMNYqmWJtfLyR3cCARoM3WSjwcCCCusAaY lDtA== X-Google-DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=1e100.net; s=20230601; t=1720800206; x=1721405006; h=content-transfer-encoding:mime-version:references:in-reply-to :message-id:date:subject:cc:to:from:x-gm-message-state:from:to:cc :subject:date:message-id:reply-to; bh=YLifC1TxjruMonFT8U23tJUUqGaBbLMn1FCOnDYPeY8=; b=QEY7hhxop8krFsvOJCLzZXkeaIHA/y8+sRyRJTHRINsYsq8cpGYTCOzLBmRdn7Wum7 Vnbvr6NGkPj6nx4AAyWCSzA+IEsfNW5V80TcRt1Tw8QLyhMeLUv2gp+uqzQgKjDxMcEy o8aeuQKIcg7jnbpQMsfwXaRlWWAWzuzxvq6F50qxo8jOmwisYm9YJD7fAUcl0ZOVpUMN 
rJECi9I2vm52zHYaXjLVkGtYTcYv6QsPFQQc6L7JH9GVdnyqUZSFTiltzb/lUmu1iI1W XIZ4XdJiOAqT8SC5jgQZUcSFeY3x6S2kUUT1SodQcnf+L5e2Db8CZglLXAw0R0xl9YKF cCZQ== X-Gm-Message-State: AOJu0Yy8l3sfpkMb/jdMwB9Q0N4UBNh/vpyw6bmf2JXayPmLpoUjtbbG ntcfuhqoQUT3+kZ8Gc0z7ojmEL51YOz2vlXHmi2IV6cmzlpr7FecESjckw== X-Google-Smtp-Source: AGHT+IH7hcEfz0MEn0DCS2P9QTc5gtrBCxVNa3Z/gS3aZ6c8LeIhQFkfWHE4qk8uhL1LWPNr36J+WA== X-Received: by 2002:a05:6870:918d:b0:25e:1775:b02d with SMTP id 586e51a60fabf-25eae8af595mr9670032fac.32.1720800205954; Fri, 12 Jul 2024 09:03:25 -0700 (PDT) Received: from localhost.localdomain ([2601:282:4300:19e0::4a71]) by smtp.gmail.com with ESMTPSA id 586e51a60fabf-25eaa29d16dsm2267694fac.53.2024.07.12.09.03.25 (version=TLS1_3 cipher=TLS_AES_256_GCM_SHA384 bits=256/256); Fri, 12 Jul 2024 09:03:25 -0700 (PDT) From: Joshua Watt X-Google-Original-From: Joshua Watt To: openembedded-core@lists.openembedded.org Cc: Joshua Watt Subject: [OE-core][PATCH v6 11/12] classes/create-spdx-2.2: Handle empty packages Date: Fri, 12 Jul 2024 09:58:21 -0600 Message-ID: <20240712160304.3514496-12-JPEWhacker@gmail.com> X-Mailer: git-send-email 2.45.2 In-Reply-To: <20240712160304.3514496-1-JPEWhacker@gmail.com> References: <20240703140059.4096394-1-JPEWhacker@gmail.com> <20240712160304.3514496-1-JPEWhacker@gmail.com> MIME-Version: 1.0 List-Id: X-Webhook-Received: from li982-79.members.linode.com [45.33.32.79] by aws-us-west-2-korg-lkml-1.web.codeaurora.org with HTTPS for ; Fri, 12 Jul 2024 16:03:33 -0000 X-Groupsio-URL: https://lists.openembedded.org/g/openembedded-core/message/201845 When combining an SPDX document, the package list might be empty (e.g. a baremetal image). Handle this case instead of erroring out Signed-off-by: Joshua Watt --- meta/classes/create-spdx-2.2.bbclass | 83 ++++++++++++++-------------- 1 file changed, 42 insertions(+), 41 deletions(-) diff --git a/meta/classes/create-spdx-2.2.bbclass b/meta/classes/create-spdx-2.2.bbclass index 0382e4cc51a..865323d66a6 100644 --- a/meta/classes/create-spdx-2.2.bbclass +++ b/meta/classes/create-spdx-2.2.bbclass @@ -822,52 +822,53 @@ def combine_spdx(d, rootfs_name, rootfs_deploydir, rootfs_spdxid, packages, spdx doc.packages.append(image) - for name in sorted(packages.keys()): - if name not in providers: - bb.fatal("Unable to find SPDX provider for '%s'" % name) + if packages: + for name in sorted(packages.keys()): + if name not in providers: + bb.fatal("Unable to find SPDX provider for '%s'" % name) - pkg_name, pkg_hashfn = providers[name] + pkg_name, pkg_hashfn = providers[name] - pkg_spdx_path = oe.sbom.doc_find_by_hashfn(deploy_dir_spdx, package_archs, pkg_name, pkg_hashfn) - if not pkg_spdx_path: - bb.fatal("No SPDX file found for package %s, %s" % (pkg_name, pkg_hashfn)) + pkg_spdx_path = oe.sbom.doc_find_by_hashfn(deploy_dir_spdx, package_archs, pkg_name, pkg_hashfn) + if not pkg_spdx_path: + bb.fatal("No SPDX file found for package %s, %s" % (pkg_name, pkg_hashfn)) - pkg_doc, pkg_doc_sha1 = oe.sbom.read_doc(pkg_spdx_path) + pkg_doc, pkg_doc_sha1 = oe.sbom.read_doc(pkg_spdx_path) - for p in pkg_doc.packages: - if p.name == name: - pkg_ref = oe.spdx.SPDXExternalDocumentRef() - pkg_ref.externalDocumentId = "DocumentRef-%s" % pkg_doc.name - pkg_ref.spdxDocument = pkg_doc.documentNamespace - pkg_ref.checksum.algorithm = "SHA1" - pkg_ref.checksum.checksumValue = pkg_doc_sha1 + for p in pkg_doc.packages: + if p.name == name: + pkg_ref = oe.spdx.SPDXExternalDocumentRef() + pkg_ref.externalDocumentId = "DocumentRef-%s" % pkg_doc.name + pkg_ref.spdxDocument = 
pkg_doc.documentNamespace + pkg_ref.checksum.algorithm = "SHA1" + pkg_ref.checksum.checksumValue = pkg_doc_sha1 - doc.externalDocumentRefs.append(pkg_ref) - doc.add_relationship(image, "CONTAINS", "%s:%s" % (pkg_ref.externalDocumentId, p.SPDXID)) - break - else: - bb.fatal("Unable to find package with name '%s' in SPDX file %s" % (name, pkg_spdx_path)) - - runtime_spdx_path = oe.sbom.doc_find_by_hashfn(deploy_dir_spdx, package_archs, "runtime-" + name, pkg_hashfn) - if not runtime_spdx_path: - bb.fatal("No runtime SPDX document found for %s, %s" % (name, pkg_hashfn)) - - runtime_doc, runtime_doc_sha1 = oe.sbom.read_doc(runtime_spdx_path) - - runtime_ref = oe.spdx.SPDXExternalDocumentRef() - runtime_ref.externalDocumentId = "DocumentRef-%s" % runtime_doc.name - runtime_ref.spdxDocument = runtime_doc.documentNamespace - runtime_ref.checksum.algorithm = "SHA1" - runtime_ref.checksum.checksumValue = runtime_doc_sha1 - - # "OTHER" isn't ideal here, but I can't find a relationship that makes sense - doc.externalDocumentRefs.append(runtime_ref) - doc.add_relationship( - image, - "OTHER", - "%s:%s" % (runtime_ref.externalDocumentId, runtime_doc.SPDXID), - comment="Runtime dependencies for %s" % name - ) + doc.externalDocumentRefs.append(pkg_ref) + doc.add_relationship(image, "CONTAINS", "%s:%s" % (pkg_ref.externalDocumentId, p.SPDXID)) + break + else: + bb.fatal("Unable to find package with name '%s' in SPDX file %s" % (name, pkg_spdx_path)) + + runtime_spdx_path = oe.sbom.doc_find_by_hashfn(deploy_dir_spdx, package_archs, "runtime-" + name, pkg_hashfn) + if not runtime_spdx_path: + bb.fatal("No runtime SPDX document found for %s, %s" % (name, pkg_hashfn)) + + runtime_doc, runtime_doc_sha1 = oe.sbom.read_doc(runtime_spdx_path) + + runtime_ref = oe.spdx.SPDXExternalDocumentRef() + runtime_ref.externalDocumentId = "DocumentRef-%s" % runtime_doc.name + runtime_ref.spdxDocument = runtime_doc.documentNamespace + runtime_ref.checksum.algorithm = "SHA1" + runtime_ref.checksum.checksumValue = runtime_doc_sha1 + + # "OTHER" isn't ideal here, but I can't find a relationship that makes sense + doc.externalDocumentRefs.append(runtime_ref) + doc.add_relationship( + image, + "OTHER", + "%s:%s" % (runtime_ref.externalDocumentId, runtime_doc.SPDXID), + comment="Runtime dependencies for %s" % name + ) bb.utils.mkdirhier(spdx_workdir) image_spdx_path = spdx_workdir / (rootfs_name + ".spdx.json") From patchwork Fri Jul 12 15:58:22 2024 Content-Type: text/plain; charset="utf-8" MIME-Version: 1.0 Content-Transfer-Encoding: 7bit X-Patchwork-Submitter: Joshua Watt X-Patchwork-Id: 46270 Return-Path: X-Spam-Checker-Version: SpamAssassin 3.4.0 (2014-02-07) on aws-us-west-2-korg-lkml-1.web.codeaurora.org Received: from aws-us-west-2-korg-lkml-1.web.codeaurora.org (localhost.localdomain [127.0.0.1]) by smtp.lore.kernel.org (Postfix) with ESMTP id 4D074C3DA55 for ; Fri, 12 Jul 2024 16:03:33 +0000 (UTC) Received: from mail-oo1-f54.google.com (mail-oo1-f54.google.com [209.85.161.54]) by mx.groups.io with SMTP id smtpd.web10.11483.1720800209291954500 for ; Fri, 12 Jul 2024 09:03:29 -0700 Authentication-Results: mx.groups.io; dkim=pass header.i=@gmail.com header.s=20230601 header.b=PYzhbpdi; spf=pass (domain: gmail.com, ip: 209.85.161.54, mailfrom: jpewhacker@gmail.com) Received: by mail-oo1-f54.google.com with SMTP id 006d021491bc7-5cbc5b63939so914636eaf.1 for ; Fri, 12 Jul 2024 09:03:29 -0700 (PDT) DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=gmail.com; s=20230601; t=1720800208; x=1721405008; 
darn=lists.openembedded.org; h=content-transfer-encoding:mime-version:references:in-reply-to :message-id:date:subject:cc:to:from:from:to:cc:subject:date :message-id:reply-to; bh=dZLoqttQQd0sTcjjmtleS0ipsYhkshkbTe3nrVGzReY=; b=PYzhbpdi1xpX1OMoZUtxywZjTk9NafRd3Oq89bllz575Buj44DJrnmv6FFpYZZwzYi q35tmyVWKEg4w0nzo+Gly9qaajBgxXt9YES2hYb7r1pDrSLM5NI0R+YTqjYKW5/VqPAa WhPllJMgXS/H84I7SqHHMVqlxksO54sBLJyWZoZPwkic3zDlnJzK6THyUuuWUwWhr5mh WBH2e8OPAk7QXo8jpNOvUXE57uwM47VxUkjRhIFKuqbSZWqkbcy0JvkDil6NSuLp9+J2 GeHd3l68M29aaHiXVA5z9IHQkivvbfi4WSuLNPEn4X5ob2RquR8Nk0zq1U4YHbQ40PfR OE3g== X-Google-DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=1e100.net; s=20230601; t=1720800208; x=1721405008; h=content-transfer-encoding:mime-version:references:in-reply-to :message-id:date:subject:cc:to:from:x-gm-message-state:from:to:cc :subject:date:message-id:reply-to; bh=dZLoqttQQd0sTcjjmtleS0ipsYhkshkbTe3nrVGzReY=; b=KusDgPfpBCXlOtbzpyJPBNjjp1WMciaV4SSx05MePmc2fXz6QFoI9cskF2zvo5TzBK jox9S50++lo8xbgpp+ECiu4bwweeVpd/OGpZpAFeDR9GtI23j+iqQq575/28I6HPJPMi gJby9v53lFd8/OSgP8/FI3j8p0d7cfJTXac/bw8gmfajHFAKD7lB2cE3frJoIlR37KxB BH7ButYyufJyTYqIR+1O1CvoPUyPKmyAn5PYADJaKYsajX9/CNoxDX0WgQo2ZjHiUASo 6wQTLYa2RVkPpCB1a+h757c/cIKKur2tpkkMzOHU1etOyJe04z0M8AheqaGvrDwaYWFQ e/mQ== X-Gm-Message-State: AOJu0YzZj8woZbJOGsSVYu+hz7iZGIkWvlQWqjm25rFgeFjJmwMYGFxZ GHkARpnsp9QY0Vcd//k7xKHlNkBILLw/juzx1B1phyO86teuTAE84l1MeQ== X-Google-Smtp-Source: AGHT+IHG9qmCw6f8+pnJrp4xWxdPePDak4ioFKAczkznh8E36JjLPuclsvfay6qLCV2RGcd7brGLdw== X-Received: by 2002:a05:6870:310e:b0:25d:f8fa:b53a with SMTP id 586e51a60fabf-25eae7ce03emr10657049fac.9.1720800207905; Fri, 12 Jul 2024 09:03:27 -0700 (PDT) Received: from localhost.localdomain ([2601:282:4300:19e0::4a71]) by smtp.gmail.com with ESMTPSA id 586e51a60fabf-25eaa29d16dsm2267694fac.53.2024.07.12.09.03.26 (version=TLS1_3 cipher=TLS_AES_256_GCM_SHA384 bits=256/256); Fri, 12 Jul 2024 09:03:26 -0700 (PDT) From: Joshua Watt X-Google-Original-From: Joshua Watt To: openembedded-core@lists.openembedded.org Cc: Joshua Watt Subject: [OE-core][PATCH v6 12/12] Switch default spdx version to 3.0 Date: Fri, 12 Jul 2024 09:58:22 -0600 Message-ID: <20240712160304.3514496-13-JPEWhacker@gmail.com> X-Mailer: git-send-email 2.45.2 In-Reply-To: <20240712160304.3514496-1-JPEWhacker@gmail.com> References: <20240703140059.4096394-1-JPEWhacker@gmail.com> <20240712160304.3514496-1-JPEWhacker@gmail.com> MIME-Version: 1.0 List-Id: X-Webhook-Received: from li982-79.members.linode.com [45.33.32.79] by aws-us-west-2-korg-lkml-1.web.codeaurora.org with HTTPS for ; Fri, 12 Jul 2024 16:03:33 -0000 X-Groupsio-URL: https://lists.openembedded.org/g/openembedded-core/message/201846 Changes the default SPDX version to 3.0 Signed-off-by: Joshua Watt --- meta/classes/create-spdx.bbclass | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/meta/classes/create-spdx.bbclass b/meta/classes/create-spdx.bbclass index 19c6c0ff0b9..b604973ae0a 100644 --- a/meta/classes/create-spdx.bbclass +++ b/meta/classes/create-spdx.bbclass @@ -5,4 +5,4 @@ # # Include this class when you don't care what version of SPDX you get; it will # be updated to the latest stable version that is supported -inherit create-spdx-2.2 +inherit create-spdx-3.0
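
With this last patch applied, the create-spdx wrapper class defaults to the SPDX 3.0 implementation. As a brief illustration of how a build might select a version, here is a minimal local.conf sketch; it assumes the standard INHERIT mechanism for enabling a class globally, and only the class names are taken from the patches above:

INHERIT += "create-spdx"
# or, while migrating, bypass the wrapper and pin one implementation explicitly:
#INHERIT += "create-spdx-2.2"
#INHERIT += "create-spdx-3.0"

With the 3.0 class, the image-level SBOM is emitted as <IMAGE_NAME>.spdx.json by create_image_sbom_spdx() shown earlier, alongside the per-recipe documents collected under DEPLOY_DIR_SPDX.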