Custom Project questions
To create a custom project, navigate to the Questions page and click Create question, then select Project. This opens the project creation workflow, where you’ll design a complete coding environment tailored to your specific assessment needs.
Selecting a template
Templates are preconfigured minimal projects built around specific technology combinations, designed to serve as starting points for your custom projects.
Templates are intentionally minimal to provide a clean foundation while giving you complete flexibility to customize the environment for your specific requirements.

Editing your project in the IDE
Once you select a template, scroll down to the Project section and click Edit your project in IDE. You’ll enter a full-featured VS Code environment where you can design your project.
What’s available in the IDE
Your custom project environment includes all standard VS Code features:
- Integrated terminal with full command-line access for package installation, script execution, and system operations
- IntelliSense providing intelligent code completion, parameter hints, and error detection
- AI assistant to help you refine your project
- Extension marketplace access to install language-specific tools and productivity enhancers
- Built-in debugger supporting multiple programming languages with breakpoints, variable inspection, and call stack analysis
- Git integration for version control workflows and change tracking
Candidate instructions
Every project must include an instructions.md file containing your problem statement and setup guidance, written in Markdown. This file is automatically opened and rendered when the candidate starts the question.
Use it for:
- Problem statement and acceptance criteria
- Setup/run steps and environment notes
- Any constraints, expectations, or deliverables
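As a sketch, a minimal instructions.md might look like the following (the project name, file paths, and commands are placeholders, not part of any template):

```markdown
# Shopping Cart Kata

Implement the missing methods in `src/cart.ts` so that all tests pass.

## Acceptance criteria

- `addItem` and `removeItem` update the cart total
- Invalid quantities raise an error

## Getting started

Run `npm install`, then `npm test` to execute the test suite.
```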
The .coderpad/settings.json file
The .coderpad/settings.json file controls critical project behavior and must be configured properly for an optimal candidate experience.
Example
{
  "files.exclude": ["solution.md"],
  "workbench.defaultOpenFiles": ["src/App.tsx"],
  "autograding": {
    "runCommand": "rm -f result.tap && npm ci && npx jest --ci --reporters=default --reporters=jest-tap-reporter > result.tap",
    "reports": { "TAP": ["result.tap"] }
  }
}
Configuration options
- Open files: Specify which files should be open by default when candidates start the project (in addition to instructions.md). This helps direct their attention to starting points or key files.
- Excluded files: List files and directories that won't be included in the candidate's project environment.
- Autograding logic: Define how automated tests should run and report results (detailed in the auto-grading section below).
Recommend extensions
Use the standard VS Code .vscode/extensions.json file to recommend extensions. Example:

{
  "recommendations": ["Orta.vscode-jest"]
}
When candidates start your project, they’ll receive notifications suggesting these extensions, helping them set up an optimal development environment quickly.
AI Assistant
The AI Assistant can be enabled or disabled at the test level through test settings. If enabled:
- Candidates see an AI assistant panel and can chat with an available model.
- AI conversations appear in playback for reviewers.
Front-end render
For web development projects, the VS Code Simple Browser module automatically renders your application, providing candidates with immediate visual feedback.
- When your development server starts on any available port, the Simple Browser opens automatically
- The Ports view in the VS Code panel shows all forwarded ports and their status
Web preview
For web development projects, a web preview component can be used to render your application, providing candidates with immediate visual feedback. To enable a web preview in your projects, configure the exposed service in the .coderpad/settings.json file using the following fields:
- mode: Set to "browser" to enable the web preview
- openByDefault: Determines whether the preview opens automatically when the project starts
- port: Specifies which port the preview will map to
- name: Sets a custom display name for your app
For example:
"exposed": [
  {
    "mode": "browser",
    "openByDefault": true,
    "port": 5173,
    "name": "MyApp"
  }
]
Connecting to a database
You can attach PostgreSQL or MySQL databases to any project through the Execution environment section of the question editor.
Connection credentials are provided through environment variables. For example:
username: process.env.POSTGRES_LOGIN
password: process.env.POSTGRES_PASSWORD
host: 'screen-execute-environment-node-postgres'
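For instance, a Node.js project could assemble its client configuration from these variables. This is a minimal sketch: the `pg` (node-postgres) package shown in the commented lines is an assumed dependency, not something every template includes.

```javascript
// Build a PostgreSQL client config from the environment variables
// provided to the project. The host matches the example above.
const config = {
  user: process.env.POSTGRES_LOGIN,
  password: process.env.POSTGRES_PASSWORD,
  host: 'screen-execute-environment-node-postgres',
};

// With the 'pg' package installed (assumed dependency):
// const { Client } = require('pg');
// const client = new Client(config);
// client.connect().then(() => client.query('SELECT 1'));
```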
Git integration
Projects include a default .gitignore to keep ephemeral files and build artifacts out of version control. As you edit your project, you can track all changes through the Source Control module.
When you start editing your project, a branch is created. When you click Update project, your changes are automatically committed and pushed to the remote project repository.
Once you save the question, the pushed commits are merged into main and a release is created. Advanced users can manage commits manually through VS Code's Source Control panel.
How to save your work
- Save all files in VS Code (Ctrl/Cmd + S)
- Click Update project at the bottom of the screen
- Close the IDE
- Click Save at the bottom of the question page
⚠️ Your temporary branch containing your updated project will only be merged into the main branch once you have saved the question from the question editor.
Auto-grading
A properly configured .coderpad/settings.json file is required for auto-grading functionality. The configuration defines how tests run and where results are stored.
Configuration example
{
  "autograding": {
    "reports": { "TAP": ["result.tap"] },
    "runCommand": "rm -f result.tap && npm ci && npx jest --ci --reporters=default --reporters=jest-tap-reporter > result.tap"
  }
}
Run command requirements
The runCommand must be designed to work in a fresh container environment and should:
- Install dependencies: Use npm ci, pip install -r requirements.txt, or equivalent
- Execute tests: Run your testing framework with appropriate reporters
- Generate reports: Output results in TAP or JUnit format
- Handle cleanup: Remove old result files to prevent conflicts
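As an illustration, a Python project using pytest might satisfy all four requirements with a configuration like this sketch (result.xml and requirements.txt are assumptions about the project layout):

```json
{
  "autograding": {
    "runCommand": "rm -f result.xml && pip install -r requirements.txt && python -m pytest --junitxml=result.xml",
    "reports": { "JUNIT": ["result.xml"] }
  }
}
```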
Supported report formats
Your test command should generate test reports in either TAP or JUnit format. The reports field defines where test reports are written and in which format (TAP or JUNIT).
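For reference, a TAP report is a plain-text file; a run of two tests might produce something like the following (the test names are placeholders):

```
TAP version 13
1..2
ok 1 - adds an item to the cart
not ok 2 - rejects negative quantities
```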
Evaluation criteria configuration
Once you've configured your settings.json file, you can customize how automated tests affect your reports through the Evaluation criteria section in the question editor.
In the Automatic section, click Sync from project to import test cases from your project. The system will boot a fresh environment, run your runCommand, parse the generated reports, and display individual test cases for configuration.
For each imported test, you can customize:
- Label:
  - Provides a human-readable description displayed in reports
  - Helps reviewers understand what each test validates
- Skill:
  - Groups points under specific skill categories (e.g., Problem Solving, Reliability)
  - Each evaluation criterion contributes points to its assigned skill
- Points:
  - Adjust the weight from 0 to 5 to allocate more or fewer points to each test
  - Higher weights = more points allocated
  - Total question points are distributed proportionally across all criteria based on their weights
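The proportional distribution can be illustrated with a short sketch (how the product rounds fractional points is an assumption left out here):

```python
def distribute_points(total_points, weights):
    """Split a question's total points across criteria, proportionally to weight."""
    weight_sum = sum(weights)
    return [total_points * w / weight_sum for w in weights]

# A 100-point question with three criteria weighted 1, 2, and 2:
print(distribute_points(100, [1, 2, 2]))  # [20.0, 40.0, 40.0]
```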
Manual grading
You can complement automated testing with manual criteria for qualitative signals (e.g., code structure, readability, test quality, security).
If manual criteria are defined, you'll receive an email when a candidate completes their test, prompting you to grade their work against these criteria.
Each project must include at least one evaluation criterion (automatic or manual).