Create query

To use Snyk Code custom rules to create queries with suggestive AI support, you can choose from the provided templates and predicates. Alternatively, you can create your own predicates and save them as a custom rule.

Consider the following query examples and rules to use with Snyk Code custom rules. A CWE-312 query example is provided on this page.

Simple syntactical query

Copy the following source code snippet into the snippet window and select C# as the language.

It is only a snippet and not a full program. It will not compile.

// Read request body
string body;
using (var reader = new StreamReader(context.Request.Body))
{
   body = await reader.ReadToEndAsync();
}
// Parse JSON data
var form = JsonConvert.DeserializeObject<SignupForm>(body);
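// Build the SQL statement with user-controlled values (SQL injection)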
var sql = String.Format("INSERT INTO submissions(email, name) VALUES('{0}', '{1}')", form.Email, form.Name);
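// Hardcoded email address, matched by the Literal and regex queries below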
form.Email = "nobody@notrealdomain.co.uk";
using var cmd = new NpgsqlCommand(sql, conn);

Running the query

Enter the following queries in the query window and press Run Query to see the results.

  1. Select body by using the query: "body"

This query does not select Body with a capital B. The query language is case-sensitive.

  2. Add Body to the findings so the query becomes Or<"body","Body">.

  3. You can achieve the same outcome using a regex: ~"body|Body" or ~"[Bb]ody".

  4. Try a more complex regex: ~"[a-z0-9!#$%&'*+/=?^_`{|}~-]+(?:\.[a-z0-9!#$%&'*+/=?^_`{|}~-]+)*@(?:[a-z0-9](?:[a-z0-9-]*[a-z0-9])?\.)+[a-z0-9](?:[a-z0-9-]*[a-z0-9])?" It matches the hardcoded email address.

Try it yourself

Run the following query over your code: ~"([a-zA-Z0-9+/]{40})" If you find something, check it out first, as you might be leaking your AWS secrets.

If you are interested in a certain type of object, you can use templates. For example, the query CallExpression<"Format"> matches the Format function call, and Literal<"nobody@notrealdomain.co.uk"> matches the string literal containing the email address.

A data flow or taint analysis

For this example, a JavaScript code snippet is used. Copy it into the snippet window and select JavaScript as the language.

const express = require('express');
const bodyParser = require('body-parser');
const { Client } = require('pg');
const fs = require('fs');


const app = express();
app.use(bodyParser.json());


const client = new Client({
   host: 'localhost',
   user: 'youruser',
   password: 'yourpassword',
   database: 'yourdbname'
});


async function connectDb(client) {
   await client.connect();
}


async function insertSubmission(client, email, name) {
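   // Unsanitized user input is interpolated into the SQL string (SQL injection sink)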
   await client.query(`INSERT INTO submissions(email, name) VALUES(${email}, ${name})`);
}


function logSubmission(email, name) {
   const logMessage = `New submission: Email=${email}, Name=${name}\n`;
   fs.appendFileSync('myapp.log', logMessage);
}


app.post('/signup', async (req, res) => {
   try {
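       // req.body carries external, attacker-controlled data (taint source)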
       const { email, name } = req.body;
       await insertSubmission(client, email, name);
       logSubmission(email, name);
       res.send({ message: 'Signup successful!' });
   } catch (err) {
       console.error(err);
       res.status(500).send({ message: 'An error occurred.' });
   }
});


connectDb(client).then(() => {
   app.listen(3000, () => console.log('Server is running on port 3000'));
});

Snyk Code maintains a list of possible sources of external data in the predicate PRED:AnySource. Running the query PRED:AnySource shows you that app.post() is identified as a source.

The query PRED:SqliSink shows you that query() is part of the list of SQL injection sinks. The query engine comes with many different predicates for various source, sink, and sanitizer types. Check the list of predicates to see them all.

To check whether the data flows into a SQL injection sink, use the following query: DataFlowsInto<PRED:SqliSink>. It shows you that in the program, data from the req parameter flows into query(), taking several turns along the way.
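The turns the engine follows can be read from the snippet above. The following condensed excerpt, annotated for illustration only, retraces that flow:

app.post('/signup', async (req, res) => {
    const { email, name } = req.body;              // 1. external data enters through req
    await insertSubmission(client, email, name);   // 2. passed on as function arguments
});

async function insertSubmission(client, email, name) {
    // 3. interpolated into the SQL string, which reaches the sink client.query()
    await client.query(`INSERT INTO submissions(email, name) VALUES(${email}, ${name})`);
}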

To check whether the data flow also passes through a sanitizer, use a specialized template. Change the query to Taint<PRED:AnySource, PRED:SqliSanitizer, PRED:SqliSink>.

There is nothing language-specific in the query. It would work on similar code in other languages.
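As a minimal remediation sketch, assuming the node-postgres parameterized query API, the user input can be passed as query parameters instead of being interpolated into the SQL string. Whether the Taint query then stops reporting the flow depends on how the predicates classify the call:

// Sketch: pass user input as parameters rather than building the SQL string by hand
async function insertSubmission(client, email, name) {
    await client.query(
        'INSERT INTO submissions(email, name) VALUES($1, $2)',
        [email, name]
    );
}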

Net new data flow rule

Create a new rule when Snyk is not aware of a proprietary source built in-house, which would otherwise result in missed findings.

Use a data flow template known as Taint when creating a data flow query.

Taint<PRED:"SourceFoo",PRED:XssSanitizer,PRED:XssSink>

You can configure the following parameters:

  • Source: The first parameter indicates where the data flow starts.

  • Sanitizer: The second parameter indicates a known sanitizer that would sanitize the data, so the data is no longer considered tainted.

  • Sink: The third parameter indicates where the data flow ends.

Custom predicates are indicated by writing their names within quotation marks. In this scenario, the custom method is called SourceFoo.

With this query, you look for data flow that originates in SourceFoo, a source unknown to Snyk, ends up in a known vulnerable cross-site scripting (XSS) Sink, and does not pass through a known cross-site scripting (XSS) Sanitizer. Therefore, the assumption is that the data is tainted.
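For illustration only, a proprietary in-house source could look like the following sketch. The module name internal-foo-client and the route are hypothetical; SourceFoo is the custom method named above, and res.send() stands in for a known XSS sink:

const foo = require('internal-foo-client'); // hypothetical in-house module

app.get('/latest-message', async (req, res) => {
    const message = await foo.SourceFoo();           // proprietary source, unknown to Snyk
    res.send(`<p>Latest message: ${message}</p>`);   // flows into an XSS sink without sanitization
});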

Extend a data flow rule

Recreate a Snyk rule and add a source to Snyk's list of known vulnerable sources, because sources Snyk does not know about are not taken into account in scans, resulting in missed vulnerabilities.

As in the Net new data flow rule, the Taint data flow template is used, this time combined with an Or operator. Operators such as Or and And are available to create logical statements in your queries.

Run the data flow rule using both the sources known to Snyk and a custom source called SourceFoo.

Taint<Or<PRED:AnySource,"SourceFoo">,PRED:XssSanitizer,PRED:XssSink>

With this query, you look for data flow that originates either in a source known to Snyk or in "SourceFoo", ends up in a known vulnerable cross-site scripting (XSS) Sink, and does not pass through a known cross-site scripting (XSS) Sanitizer. Therefore, the assumption is that the data is tainted.

Any statement that uses an operator is written within angle brackets: <statement>.

Context added to data flow rule