Each row below describes one detector from the Amazon Q Detector Library for Python, with the following fields: name, severity, category, CWE list, tags, description, noncompliant example (where available), compliant example (where available), and documentation URL.
Resource management errors CDK
High
security
[ "CWE-399" ]
[ "amazon-s3", "aws-cdk", "efficiency" ]
Proper resource management is important for robust, secure applications that maintain functionality over long periods of operation. From a security perspective, exhausted resources can enable denial-of-service attacks and other issues if safety checks start failing.
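This entry ships no code sample. The detector targets AWS CDK apps, but the underlying resource-management principle can be sketched in plain Python (the function names below are illustrative, not taken from the detector docs):

```python
import os
import tempfile

def write_records_noncompliant(records):
    # Noncompliant sketch: the handle is never closed, so file
    # descriptors accumulate over a long period of operation.
    fd, path = tempfile.mkstemp()
    f = os.fdopen(fd, "w")
    for item in records:
        f.write(item + "\n")
    return path  # f is never closed

def write_records_compliant(records):
    fd, path = tempfile.mkstemp()
    # Compliant sketch: the 'with' block releases the descriptor even
    # if an exception is raised mid-loop.
    with os.fdopen(fd, "w") as f:
        for item in records:
            f.write(item + "\n")
    return path
```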
https://docs.aws.amazon.com/amazonq/detector-library/python/resource-management-errors-cdk/
S3 partial encrypt CDK
High
security
[]
[ "amazon-s3", "aws-cdk" ]
Failing to encrypt a bucket could lead to sensitive data being exposed to unauthorized users. Consider adding `S3_MANAGED` or `KMS_MANAGED` encryption when creating a bucket.
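This entry has no code sample. In CDK the fix is, assuming aws-cdk-lib's `aws_s3` module, passing `encryption=s3.BucketEncryption.S3_MANAGED` (or `KMS_MANAGED`) to `s3.Bucket`. Since that requires aws-cdk-lib, the sketch below instead shows, as plain data, the CloudFormation property block such a bucket synthesizes (treat it as illustrative):

```python
def bucket_encryption_properties(algorithm="AES256"):
    # Equivalent AWS::S3::Bucket CloudFormation properties for an
    # encrypted bucket. 'AES256' corresponds to S3_MANAGED encryption;
    # 'aws:kms' corresponds to KMS_MANAGED.
    return {
        "BucketEncryption": {
            "ServerSideEncryptionConfiguration": [
                {"ServerSideEncryptionByDefault": {"SSEAlgorithm": algorithm}}
            ]
        }
    }
```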
https://docs.aws.amazon.com/amazonq/detector-library/python/s3-partial-encrypt-cdk/
Missing S3 bucket owner condition
Low
security
[]
[ "amazon-s3", "aws-python-sdk", "data-integrity", "security-context" ]
Not setting the S3 bucket owner condition might introduce a risk of accidentally using a wrong bucket. For example, a configuration error could lead to accidentally writing production data into test accounts.
def verify_s3bucket_owner_noncompliant(event):
    import boto3
    client = boto3.client('s3')
    # Noncompliant: missing S3 bucket owner condition
    # (ExpectedSourceBucketOwner).
    client.copy_object(
        Bucket=event["bucket"],
        CopySource=f"{event['bucket']}/{event['key']}",
        Key=event["key"],
        ExpectedBucketOwner=event["owner"],
    )
def verify_s3bucket_owner_compliant(event):
    import boto3
    client = boto3.client('s3')
    # Compliant: sets the S3 bucket owner condition (ExpectedSourceBucketOwner).
    client.copy_object(
        Bucket=event["bucket"],
        CopySource=f"{event['bucket']}/{event['key']}",
        Key=event["key"],
        ExpectedBucketOwner=event["owner"],
        ExpectedSourceBucketOwner=event["owner2"]
    )
https://docs.aws.amazon.com/amazonq/detector-library/python/s3-verify-bucket-owner/
Semaphore overflow prevention
Medium
code-quality
[]
[ "concurrency" ]
When you remove an item from a `JoinableQueue` without calling `JoinableQueue.task_done()` and then process that item, a semaphore overflow exception might be thrown. This happens when the semaphore used to count the number of unfinished tasks overflows.
def post_tasks_noncompliant(jobs, es_url):
    import multiprocessing
    import requests
    jobs = multiprocessing.JoinableQueue()
    while True:
        try:
            # Noncompliant: fails to call JoinableQueue.task_done()
            # for each task removed from the JoinableQueue.
            image, image_name, tag = jobs.get()
            formatted_es_url = es_url.format(image_name)
            files = {'file': image.content, 'tag': tag}
            r = requests.post(formatted_es_url, files=files)
        finally:
            print("Task Done!!")
def post_tasks_compliant(jobs, es_url):
    import multiprocessing
    import requests
    jobs = multiprocessing.JoinableQueue()
    while True:
        try:
            image, image_name, tag = jobs.get()
            formatted_es_url = es_url.format(image_name)
            files = {'file': image.content, 'tag': tag}
            r = requests.post(formatted_es_url, files=files)
        finally:
            # Compliant: calls JoinableQueue.task_done()
            # for each task removed from the JoinableQueue.
            jobs.task_done()
https://docs.aws.amazon.com/amazonq/detector-library/python/semaphore-overflow-prevention/
Sensitive information leak
High
security
[ "CWE-200" ]
[ "information-leak", "owasp-top10", "top25-cwes" ]
This code might expose sensitive information to an actor who is not explicitly authorized to have access to the information. This could have serious consequences depending on the type of information revealed and how attackers can use the information.
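No example is provided for this detector. A minimal sketch of one common mitigation, redacting sensitive fields before a record crosses a trust boundary (the field names below are illustrative):

```python
SENSITIVE_KEYS = {"password", "ssn", "token"}  # illustrative field names

def redact(record):
    # Replace sensitive values before the record leaves a trust boundary
    # (e.g., before logging it or returning it in an API response).
    return {key: ("***" if key in SENSITIVE_KEYS else value)
            for key, value in record.items()}
```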
https://docs.aws.amazon.com/amazonq/detector-library/python/sensitive-information-leak/
Server-side request forgery
High
security
[ "CWE-918" ]
[ "configuration", "injection", "networking", "owasp-top10", "top25-cwes" ]
Insufficient sanitization of potentially untrusted URLs on the server side can lead to the server issuing requests to unwanted hosts, ports, or protocols, which can bypass proxies, firewalls, and other security measures.
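This entry carries no example. As a minimal sketch of an allowlist-based defense (the host name below is hypothetical), server-side URL validation can look like:

```python
from urllib.parse import urlparse

ALLOWED_HOSTS = {"api.example.com"}  # hypothetical allowlist

def is_safe_url(url):
    # Reject anything that is not HTTPS to an allowlisted host,
    # preventing the server from being steered to internal endpoints.
    parts = urlparse(url)
    return parts.scheme == "https" and parts.hostname in ALLOWED_HOSTS
```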
https://docs.aws.amazon.com/amazonq/detector-library/python/server-side-request-forgery/
Incorrect binding of SNS publish operations
Low
code-quality
[]
[ "amazon-sns", "availability", "aws-python-sdk" ]
Binding SNS publish operations to `subscribe` or `create_topic` operations can cause latency issues with newly created topics.
def sns_publish_noncompliant(self, sqs_arn: str, topic_arn: str) -> None:
    import boto3
    session = boto3.Session()
    sns_client = session.client('sns')
    sns_client.subscribe(TopicArn=topic_arn,
                         Protocol='sqs',
                         Endpoint=sqs_arn,
                         ReturnSubscriptionArn=True)
    # Noncompliant: incorrect binding of SNS publish operations
    # with 'subscribe' or 'create_topic' operations.
    sns_client.publish(TopicArn=topic_arn,
                       Message='test message for SQS',
                       MessageAttributes={'attr1': {
                           'DataType': 'String',
                           'StringValue': "short_uid"
                       }})
def sns_publish_compliant(self, sqs_arn: str, topic_arn: str) -> None:
    import boto3
    session = boto3.Session()
    sns_client = session.client('sns')
    response = sns_client.subscribe(TopicArn=topic_arn,
                                    Protocol='sqs',
                                    Endpoint=sqs_arn,
                                    ReturnSubscriptionArn=True)
    # Compliant: avoids binding of SNS publish operations
    # with 'subscribe' or 'create_topic' operations.
    return response
https://docs.aws.amazon.com/amazonq/detector-library/python/sns-no-bind-subscribe-publish-rule/
Set SNS Return Subscription ARN
Info
code-quality
[ "CWE-1228" ]
[ "amazon-sns", "aws-python-sdk" ]
The Amazon SNS subscribe operation by default returns either the subscription ARN (if the subscribed endpoint is managed by AWS and it belongs to the same account as the topic) or the phrase: `PENDING CONFIRMATION`. If you want to always return the subscription ARN, set the `ReturnSubscriptionArn` argument to `True`.
def set_return_subscription_noncompliant(self, sqs_arn: str, topic_arn: str) -> None:
    import botocore
    session = botocore.session.get_session()
    sns_client = session.create_client('sns', 'us-west-2')
    # Noncompliant: fails to set the 'ReturnSubscriptionArn' argument to
    # 'True' while returning the subscription ARN.
    sns_client.subscribe(TopicArn=topic_arn,
                         Protocol='sqs',
                         Endpoint=sqs_arn)
def set_return_subscription_compliant(self, sqs_arn: str, topic_arn: str) -> None:
    import botocore
    session = botocore.session.get_session()
    sns_client = session.create_client('sns', 'us-west-2')
    # Compliant: sets the 'ReturnSubscriptionArn' argument to 'True'
    # while returning the subscription ARN.
    sns_client.subscribe(TopicArn=topic_arn,
                         Protocol='sqs',
                         Endpoint=sqs_arn,
                         ReturnSubscriptionArn=True)
https://docs.aws.amazon.com/amazonq/detector-library/python/sns-set-return-subscription-arn/
Unauthenticated Amazon SNS unsubscribe requests might succeed
High
security
[ "CWE-19" ]
[ "access-control", "amazon-sns", "aws-python-sdk", "data-integrity" ]
Failing to set the `AuthenticateOnUnsubscribe` flag to `True` when confirming an SNS subscription causes all unsubscribe requests to succeed, even if they are unauthenticated. Consider setting this flag to `True`.
def authenticate_on_subscribe_noncompliant(self, event) -> None:
    import boto3
    subscriptions_failed = 0
    for record in event["Records"]:
        message = record["body"]
        if message["Type"] == "SubscriptionConfirmation":
            try:
                topic_arn = message["TopicArn"]
                token = message["Token"]
                sns_client = boto3.client("sns",
                                          region_name=topic_arn.split(":")[3])
                # Noncompliant: fails to set the 'AuthenticateOnUnsubscribe'
                # argument to 'True' while confirming an SNS subscription.
                sns_client.confirm_subscription(TopicArn=topic_arn, Token=token)
            except Exception:
                subscriptions_failed += 1
def authenticate_on_subscribe_compliant(self, event) -> None:
    import boto3
    subscriptions_failed = 0
    for record in event["Records"]:
        message = record["body"]
        if message["Type"] == "SubscriptionConfirmation":
            try:
                topic_arn = message["TopicArn"]
                token = message["Token"]
                sns_client = boto3.client("sns",
                                          region_name=topic_arn.split(":")[3])
                # Compliant: sets the 'AuthenticateOnUnsubscribe' argument to
                # 'True' while confirming an SNS subscription.
                sns_client.confirm_subscription(
                    TopicArn=topic_arn,
                    Token=token,
                    AuthenticateOnUnsubscribe='True')
            except Exception:
                subscriptions_failed += 1
https://docs.aws.amazon.com/amazonq/detector-library/python/sns-unauthenticated-unsubscribe/
Socket close platform compatibility
High
code-quality
[]
[ "availability", "networking", "resource-leak" ]
On some platforms `os.close` does not work for socket file descriptors. This is most noticeable on Windows.
def create_socket_noncompliant(samplehost, sampleport, samplebuffersize):
    import socket
    sock = socket.socket()
    sock.settimeout(10.0)
    sock.connect((samplehost, sampleport))
    print(sock.recv(samplebuffersize))
    # Noncompliant: socket.shutdown is not called before closing the socket.
    sock.close()
def create_socket_compliant(samplehost, sampleport, samplebuffersize):
    import socket
    sock = socket.socket()
    sock.settimeout(10.0)
    sock.connect((samplehost, sampleport))
    try:
        print(sock.recv(samplebuffersize))
    finally:
        # Compliant: socket.shutdown is called before closing the socket.
        sock.shutdown(socket.SHUT_WR)
        sock.close()
https://docs.aws.amazon.com/amazonq/detector-library/python/socket-close-platform-compatibility/
Socket connection timeout
Medium
security
[]
[ "availability", "networking", "resource-leak", "security-context" ]
A new Python socket has no timeout by default (its timeout defaults to `None`). Not setting the connection timeout parameter leaves the socket in blocking mode, where operations block until they complete or the system returns an error.
def create_socket_noncompliant(samplehost, sampleport, samplebuffersize):
    import socket
    # Noncompliant: socket timeout is not set.
    sock = socket.create_connection((samplehost, sampleport))
    try:
        print(sock.recv(samplebuffersize))
    finally:
        sock.close()
def create_socket_compliant(samplehost, sampleport, samplebuffersize):
    import socket
    # Compliant: socket timeout is set.
    sock = socket.create_connection((samplehost, sampleport), timeout=10)
    try:
        print(sock.recv(samplebuffersize))
    finally:
        sock.close()
https://docs.aws.amazon.com/amazonq/detector-library/python/socket-connection-timeout/
SQL injection
High
security
[ "CWE-89" ]
[ "injection", "sql", "owasp-top10", "top25-cwes" ]
User-provided inputs must be sanitized before being used to generate a SQL database query. An attacker can create and use untrusted input to run query statements that read, modify, or delete database content.
def execute_query_noncompliant(request):
    import sqlite3
    name = request.GET.get("name")
    query = "SELECT * FROM Users WHERE name = " + name + ";"
    with sqlite3.connect("example.db") as connection:
        cursor = connection.cursor()
        # Noncompliant: user input is used without sanitization.
        cursor.execute(query)
        connection.commit()
    connection.close()
def execute_query_compliant(request):
    import re
    import sqlite3
    name = request.GET.get("name")
    query = ("SELECT * FROM Users WHERE name = "
             + re.sub('[^a-zA-Z]+', '', name) + ";")
    with sqlite3.connect("example.db") as connection:
        cursor = connection.cursor()
        # Compliant: user input is sanitized before use.
        cursor.execute(query)
        connection.commit()
    connection.close()
https://docs.aws.amazon.com/amazonq/detector-library/python/sql-injection/
Stack trace exposure
Info
security
[ "CWE-209" ]
[ "owasp-top10" ]
It seems that you are returning a stack trace to the user. We recommend that you use exception handling and send an error message to the user.
import traceback

from flask import jsonify
from werkzeug.exceptions import HTTPException

@app_flask.route('/noncompliant/<text>')
def stack_trace_exposure_noncompliant(text):
    try:
        if text == 'error':
            raise HTTPException
        return jsonify({'data': 'some_data'}), 200
    except HTTPException:
        # Noncompliant: stack trace is returned from the API call.
        return traceback.format_exc()
from flask import jsonify
from werkzeug.exceptions import HTTPException

@app_flask.route('/compliant/<text>')
def stack_trace_exposure_compliant(text):
    try:
        if text == 'error':
            raise HTTPException
        return jsonify({'data': 'some_data'}), 200
    except HTTPException:
        # Compliant: custom JSON response with a message as the cause of error.
        return jsonify({'message': 'Internal error occurred!'}), 404
https://docs.aws.amazon.com/amazonq/detector-library/python/stack-trace-exposure/
Inefficient string concatenation inside loop
Info
code-quality
[]
[ "efficiency" ]
Concatenating immutable sequences creates a new object each time, which causes a quadratic runtime cost when done inside a loop.
def string_concatenation_noncompliant():
    samplelist = ['sampleString1', 'sampleString2', 'sampleString3']
    concatenatedstring = ''
    for item in samplelist:
        # Noncompliant: inefficient string concatenation inside a loop is used.
        concatenatedstring += item + "\n"
    return concatenatedstring
def string_concatenation_compliant():
    samplelist = ['sampleString1', 'sampleString2', 'sampleString3']
    concatenatedlist = []
    for item in samplelist:
        concatenatedlist.append(item)
        concatenatedlist.append("\n")
    # Compliant: join function is used for string concatenation.
    concatenatedstring = ''.join(concatenatedlist)
    return concatenatedstring
https://docs.aws.amazon.com/amazonq/detector-library/python/string-concatenation/
Outdated subprocess module API
High
code-quality
[]
[ "maintainability", "subprocess" ]
Using outdated multiprocessing API calls to start and communicate with processes is not recommended; the `subprocess` module can be used instead.
def subprocess_call_noncompliant():
    import subprocess
    with open("~/output.txt", "w") as f:
        # Noncompliant: uses 'subprocess.call' with
        # 'stdout = PIPE' or 'stderr = PIPE'.
        subprocess.call("~/test.sh", stdout=subprocess.PIPE)
def subprocess_call_compliant():
    import subprocess
    with open("~/output.txt", "w") as f:
        # Compliant: uses 'subprocess.call' without
        # 'stdout = PIPE' or 'stderr = PIPE'.
        subprocess.call("~/test.sh", stdout=f)
https://docs.aws.amazon.com/amazonq/detector-library/python/subprocess-correct-api/
Catch and swallow exception
Info
code-quality
[]
[ "maintainability" ]
Swallowing exceptions without re-throwing or logging them is a bad practice; the stack trace and other useful debugging information are lost.
def swallow_noncompliant():
    for i in range(10):
        try:
            raise ValueError()
        finally:
            # Noncompliant: uses continue, break, or return statements
            # in the finally block.
            continue
def swallow_compliant():
    for i in range(10):
        try:
            raise ValueError()
        finally:
            # Compliant: avoids using continue, break, or return statements
            # in the finally block.
            print("Done with iterations")
https://docs.aws.amazon.com/amazonq/detector-library/python/swallow-exceptions/
Synchronous publication of AWS Lambda metrics
High
code-quality
[ "CWE-1210" ]
[ "aws-python-sdk", "aws-lambda", "efficiency" ]
AWS Lambda metrics are published synchronously. To improve efficiency, write the results to a log.
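This detector entry has no code sample. A hedged sketch of the recommended fix, assuming CloudWatch's embedded metric format (EMF) is the intended logging target — a JSON log line that CloudWatch turns into a metric asynchronously (the function name and namespace below are illustrative):

```python
import json
import time

def log_metric(name, value, namespace="MyApp"):
    # Emit an embedded-metric-format (EMF) JSON line; CloudWatch converts
    # logged lines of this shape into metrics without a synchronous
    # put_metric_data call from inside the Lambda handler.
    line = json.dumps({
        "_aws": {
            "Timestamp": int(time.time() * 1000),
            "CloudWatchMetrics": [{
                "Namespace": namespace,
                "Dimensions": [[]],
                "Metrics": [{"Name": name}],
            }],
        },
        name: value,
    })
    print(line)
    return line
```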
https://docs.aws.amazon.com/amazonq/detector-library/python/sync-metric-publish/
Avoid using nondeterministic Tensorflow API
Medium
code-quality
[]
[ "machine-learning", "maintainability" ]
Detects if TensorFlow APIs such as `tf.compat.v1.Session` or `tf.distribute.experimental.ParameterServerStrategy` are used, as they can introduce non-determinism.
def tensorflow_avoid_using_nondeterministic_api_noncompliant():
    import tensorflow as tf
    data = tf.ones((1, 1))
    # Noncompliant: determinism of tf.compat.v1.Session
    # cannot be guaranteed in TF2.
    tf.config.experimental.enable_op_determinism()
    tf.compat.v1.Session(target='', graph=None, config=None)
    layer = tf.keras.layers.Input(shape=[1])
    model = tf.keras.models.Model(inputs=layer, outputs=layer)
    model.compile(loss="categorical_crossentropy", metrics="AUC")
    model.fit(x=data, y=data)
def tensorflow_avoid_using_nondeterministic_api_compliant():
    import tensorflow as tf
    tf.random.set_seed(0)
    # Compliant: uses deterministic API.
    tf.config.experimental.enable_op_determinism()
    data = tf.ones((1, 1))
    layer = tf.keras.layers.Input(shape=[1])
    model = tf.keras.models.Model(inputs=layer, outputs=layer)
    model.compile(loss="categorical_crossentropy", metrics="AUC")
    model.fit(x=data, y=data)
https://docs.aws.amazon.com/amazonq/detector-library/python/tensorflow-avoid-using-nondeterministic-api/
Tensorflow control sources of randomness
Medium
code-quality
[]
[ "machine-learning", "maintainability" ]
Detects if a random seed is set before random number generation. Setting a seed is important for improving reproducibility and avoiding non-determinism.
def tensorflow_control_sources_of_randomness_noncompliant():
    import tensorflow as tf
    # Noncompliant: seed is not set.
    print(tf.random.uniform([1]))
def tensorflow_control_sources_of_randomness_compliant(seed):
    import tensorflow as tf
    # Compliant: sets the seed.
    tf.random.set_seed(seed)
    print(tf.random.uniform([1]))
https://docs.aws.amazon.com/amazonq/detector-library/python/tensorflow-control-sources-of-randomness/
Tensorflow enable ops determinism
Medium
code-quality
[]
[ "machine-learning", "maintainability" ]
Deterministic ops produce consistent outputs when the same inputs are run multiple times on the same hardware.
def tensorflow_enable_op_determinism_noncompliant():
    import tensorflow as tf
    # Noncompliant: seed is not set and doesn't use enable_op_determinism().
    data = tf.ones((1, 1))
    layer = tf.keras.layers.Input(shape=[1])
    model = tf.keras.models.Model(inputs=layer, outputs=layer)
    model.compile(loss="categorical_crossentropy", metrics="AUC")
    model.fit(x=data, y=data)
def tensorflow_enable_op_determinism_compliant():
    import tensorflow as tf
    # Compliant: sets the seed and enable_op_determinism() is used.
    tf.keras.utils.set_random_seed(1)
    tf.config.experimental.enable_op_determinism()
    data = tf.ones((1, 1))
    layer = tf.keras.layers.Input(shape=[1])
    model = tf.keras.models.Model(inputs=layer, outputs=layer)
    model.compile(loss="categorical_crossentropy", metrics="AUC")
    model.fit(x=data, y=data)
https://docs.aws.amazon.com/amazonq/detector-library/python/tensorflow-enable-op-determinism/
Tensorflow redundant softmax
Medium
code-quality
[]
[ "machine-learning", "correctness" ]
Computing the cross entropy loss directly from logits using the `softmax_cross_entropy_with_logits` is numerically more stable than computing a softmax and then the cross entropy. The improvement comes from the internal use of the log-sum-exp trick.
def tensorflow_redundant_softmax_noncompliant():
    import tensorflow as tf
    logits = [[4.0, 2.0, 1.0], [0.0, 5.0, 1.0]]
    labels = [[1.0, 0.0, 0.0], [0.0, 0.8, 0.2]]
    # Noncompliant: using `tf.nn.softmax` with
    # `tf.nn.softmax_cross_entropy_with_logits` is redundant.
    tf.nn.softmax_cross_entropy_with_logits(labels=labels,
                                            logits=tf.nn.softmax(logits))
def tensorflow_redundant_softmax_compliant():
    import tensorflow as tf
    logits = [[4.0, 2.0, 1.0], [0.0, 5.0, 1.0]]
    labels = [[1.0, 0.0, 0.0], [0.0, 0.8, 0.2]]
    # Compliant: unscaled `logits` is passed directly
    # to `tf.nn.softmax_cross_entropy_with_logits`.
    tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits)
https://docs.aws.amazon.com/amazonq/detector-library/python/tensorflow-redundant-softmax/
Unnecessary iteration
Info
code-quality
[]
[ "efficiency" ]
Iterating when only one item is needed from a list is inefficient and can make your code difficult to read.
def find_string_noncompliant():
    data = set(["sampleString1", "sampleString2", "sampleString3"])
    # Noncompliant: a loop is used to access a single item.
    for i in data:
        if i == "sampleString1":
            print("found item")
def find_string_compliant():
    data = set(["sampleString1", "sampleString2", "sampleString3"])
    # Compliant: a loop is not used to access a single item.
    if "sampleString1" in data:
        print("found item")
https://docs.aws.amazon.com/amazonq/detector-library/python/unnecessary-iteration/
Unrestricted upload of dangerous file type
High
security
[ "CWE-434" ]
[ "injection", "owasp-top10", "top25-cwes" ]
Insufficiently restricted file uploads can allow a file to be uploaded that runs malicious code. For example, a website that doesn't check the file extension of an image can be exploited by uploading a script with an extension, such as `.php` or `.asp`, that can be run on the server.
from flask import app

@app.route('/', methods=['GET', 'POST'])
def file_upload_non_compliant():
    import os
    from flask import request
    upload_file = request.files['file']
    # Noncompliant: the uploaded file can have any extension.
    upload_file.save(os.path.join('/path/to/the/uploads',
                                  upload_file.filename))
from flask import app

@app.route('/', methods=['GET', 'POST'])
def file_upload_compliant():
    import os
    from flask import request
    extensions = {'txt', 'pdf', 'png', 'jpg', 'jpeg', 'gif'}
    upload_file = request.files['file']
    # Compliant: the uploaded file must have one of the allowed extensions.
    if '.' in upload_file.filename and \
            upload_file.filename.split('.')[-1] in extensions:
        upload_file.save(os.path.join('/path/to/the/uploads',
                                      upload_file.filename))
https://docs.aws.amazon.com/amazonq/detector-library/python/unrestricted-file-upload/
Unsafe Cloudpickle Load
High
security
[ "CWE-502" ]
[ "deserialization", "owasp-top10", "top25-cwes" ]
Detects the usage of cloudpickle.load for deserializing data from a file, which can lead to insecure deserialization vulnerabilities.
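No example accompanies this entry. `cloudpickle.load` shares the security model of the standard `pickle` module, so as a hedged stdlib sketch (the allowlist below is illustrative, not from the detector docs), a restricted unpickler can refuse dangerous globals:

```python
import importlib
import io
import pickle

# Hypothetical allowlist of globals considered safe to reconstruct.
_ALLOWED = {("builtins", "list"), ("builtins", "dict"), ("builtins", "set")}

class RestrictedUnpickler(pickle.Unpickler):
    def find_class(self, module, name):
        # Refuse any global lookup outside the allowlist, which blocks
        # deserialization gadgets such as os.system.
        if (module, name) not in _ALLOWED:
            raise pickle.UnpicklingError(
                f"forbidden global during load: {module}.{name}")
        return getattr(importlib.import_module(module), name)

def safe_loads(data: bytes):
    return RestrictedUnpickler(io.BytesIO(data)).load()
```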
https://docs.aws.amazon.com/amazonq/detector-library/python/unsafe-cloudpickle-load/
Untrusted AMI images
Medium
security
[ "CWE-349" ]
[ "amazon-ec2", "aws-python-sdk", "injection" ]
The code requests Amazon Machine Images (AMIs) by name, without filtering them by owner or AMI identifiers. The response might contain untrusted public images from other accounts. Launching an AMI from an untrusted source might inadvertently run malicious code.
def image_filter_non_compliant():
    import boto3
    ec2 = boto3.resource('ec2')
    image_name = 'The name of the AMI (provided during image creation)'
    # Noncompliant: requests Amazon Machine Images (AMIs) with
    # only a name filter, ignoring owner or AMI identifiers.
    filters = [{'Name': 'name', 'Values': [image_name]}]
    images = ec2.images.filter(Filters=filters)
def image_filter_compliant():
    import boto3
    ec2 = boto3.resource('ec2')
    image_name = 'The name of the AMI (provided during image creation)'
    owner_id = 'The AWS account ID of the owner'
    # Compliant: requests Amazon Machine Images (AMIs) with
    # both name and owner-id filters.
    filters = [
        {'Name': 'name', 'Values': [image_name]},
        {'Name': 'owner-id', 'Values': [owner_id]}
    ]
    images = ec2.images.filter(Filters=filters)
https://docs.aws.amazon.com/amazonq/detector-library/python/untrusted-ami-images/
Deserialization of untrusted object
High
security
[ "CWE-502" ]
[ "deserialization", "injection", "owasp-top10", "top25-cwes" ]
Deserialization of untrusted or potentially malformed data can be exploited for denial of service or to induce running untrusted code.
def untrusted_deserialization_noncompliant():
    import jsonpickle
    userobj = input("user")
    # Noncompliant: untrusted object deserialized without validation.
    obj = jsonpickle.decode(userobj)
    return obj
def untrusted_deserialization_compliant():
    import jsonpickle
    userobj = input("user")
    allowed_user_obj = ['example_module1', 'example_module2']
    # Compliant: untrusted object is validated before deserialization.
    if userobj in allowed_user_obj:
        obj = jsonpickle.decode(userobj)
        return obj
https://docs.aws.amazon.com/amazonq/detector-library/python/untrusted-deserialization/
Use of Default Credentials CDK
High
code-quality
[ "CWE-1392" ]
[ "aws-cdk", "efficiency" ]
Using default keys and passwords in product design simplifies manufacturing and deployment, but can lead to security risks when administrators don't change them, making it easier for attackers to breach multiple organizations.
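This entry has no example. As a hedged sketch of the underlying principle, generate a unique credential per deployment instead of shipping a shared default (in CDK specifically, a generated Secrets Manager secret would be the analogous fix; the helper below is illustrative):

```python
import secrets

def generate_initial_password(nbytes=16):
    # Produce a unique, unpredictable per-deployment credential rather
    # than a hard-coded default that every installation shares.
    return secrets.token_urlsafe(nbytes)
```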
https://docs.aws.amazon.com/amazonq/detector-library/python/use-of-default-credentials-cdk/
Use of an inefficient or incorrect API
Low
code-quality
[]
[ "efficiency", "maintainability" ]
If multiple APIs are available to perform a similar action, choose the most specialized and efficient one. This makes your code more readable and easier to understand.
def compare_strings_noncompliant():
    samplestring1 = "samplestring1"
    samplestring2 = "samplestring"
    # Noncompliant: uses find() but ignores the returned position
    # when nonnegative.
    if samplestring1.find(samplestring2) != -1:
        print("String match found.")
    else:
        print("String match not found.")
def compare_strings_compliant():
    samplestring1 = "samplestring1"
    samplestring2 = "samplestring"
    # Compliant: uses the in operator to test for presence.
    if samplestring2 in samplestring1:
        print("String match found.")
    else:
        print("String match not found.")
https://docs.aws.amazon.com/amazonq/detector-library/python/use-of-inefficient-api/
Weak obfuscation of web request
High
security
[ "CWE-522", "CWE-202" ]
[ "owasp-top10", "top25-cwes" ]
Weak obfuscation of credentials while configuring a web request leaves them vulnerable to unauthorized access. Using stronger protection significantly reduces the chances of attacks due to unauthorized access.
def http_request_noncompliant(username, password, url):
    import urllib3
    from base64 import b64encode
    userpass = "%s:%s" % (username, password)
    # Noncompliant: weak encoding used in HTTP Basic Authentication.
    authorization = b64encode(str.encode(userpass)).decode("utf-8")
    headers = {'Authorization': 'Basic %s' % authorization}
    urllib3.disable_warnings()
    http = urllib3.PoolManager()
    response = http.request('GET', url, headers=headers)
https://docs.aws.amazon.com/amazonq/detector-library/python/weak-obfuscation-of-request/
XML External Entity
High
security
[ "CWE-611" ]
[ "injection", "xml", "owasp-top10", "top25-cwes" ]
Objects that parse or handle XML data can lead to XML External Entity (XXE) attacks when not configured properly. Improper restriction of XML external entity processing can lead to server-side request forgery and information disclosure.
def xml_parse_noncompliant():
    from lxml import etree
    # Noncompliant: resolve_entities is not disabled
    # and is set to true by default.
    parser = etree.XMLParser()
    tree1 = etree.parse('resources/xxe.xml', parser)
def xml_parse_compliant():
    from lxml import etree
    # Compliant: resolve_entities is disabled.
    parser = etree.XMLParser(resolve_entities=False)
    tree1 = etree.parse('resources/xxe.xml', parser)
https://docs.aws.amazon.com/amazonq/detector-library/python/xml-external-entity/
XPath injection
High
security
[ "CWE-643" ]
[ "injection", "xml", "owasp-top10" ]
Potentially unsanitized user input in XPath queries can allow an attacker to control the query in unwanted or insecure ways. This might grant the attacker access to any data, not just the data that the original query intended.
from flask import request, app

@app.route('/user')
def find_users_noncompliant():
    import xml.etree.ElementTree as ET
    tree = ET.parse('users.xml')
    root = tree.getroot()
    username = request.args['username']
    query = "./users/user/[@name='" + username + "']/location"
    # Noncompliant: evaluating an expression built from a user-supplied
    # parameter can lead to XPath injection.
    elements = root.findall(query)
    return 'Location %s' % list(elements)
from flask import request, app

@app.route('/user')
def find_users_compliant():
    from lxml import etree
    # Compliant: passing the user-supplied value as an XPath variable,
    # rather than building the query string, prevents XPath injection.
    parser = etree.XMLParser(resolve_entities=False)
    tree = etree.parse('users.xml', parser)
    root = tree.getroot()
    username = request.args['username']
    query = "/collection/users/user[@name = $parameter_name]/location/text()"
    elements = root.xpath(query, parameter_name=username)
    return 'Location %s' % list(elements)
https://docs.aws.amazon.com/amazonq/detector-library/python/xpath-injection/
Zip bomb attack
High
security
[ "CWE-409" ]
[]
Expanding input archive files without any validation could make your code vulnerable to zip bomb attacks, which could potentially cause denial of service (DoS). We recommend that you sanitize input archive files before extracting them.
import tarfile

from flask import request

@app.route('/someUrl')
def zip_bomb_attack_noncompliant():
    file = request.files['file']
    filename = file.filename
    file.save(filename)
    tfile = tarfile.open(filename)
    # Noncompliant: untrusted archive file extracted without any validation.
    tfile.extractall('./tmp/')
    tfile.close()
import tarfile

from flask import request

@app.route('/someUrl')
def zip_bomb_attack_compliant():
    file = request.files['file']
    filename = file.filename
    file.save(filename)
    tfile = tarfile.open(filename)
    threshold_entries = 100  # some threshold value
    # Compliant: untrusted archive file is validated before extraction.
    if len(tfile.getmembers()) < threshold_entries:
        tfile.extractall('./tmp/')
    tfile.close()
https://docs.aws.amazon.com/amazonq/detector-library/python/zip-bomb-attack/