Best Practices & FAQ
Guidelines for writing effective test cases and answers to common questions.
Best Practices
Follow these guidelines to create a test suite that is easy to maintain, execute, and scale as your product grows.
Write clear, descriptive titles
A good test case title should tell the reader what feature is being tested and what specific scenario is covered — without needing to open the test case. Use the pattern: [Feature Area] - [Specific Scenario].
| Bad title | Good title |
|---|---|
| Test login | Login - Valid credentials redirect to dashboard |
| Cart test | Checkout - Remove item from cart updates total |
| API test | API - GET /users returns 401 without auth token |
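A title convention like this is easy to check mechanically, for example in a pre-commit hook or bulk-import script. A minimal sketch in Python (the regex and function name are illustrative, not part of TestKase):

```python
import re

# "[Feature Area] - [Specific Scenario]": a non-empty feature area,
# a " - " separator, then a non-empty scenario description.
TITLE_PATTERN = re.compile(r"^.+\s-\s.+$")

def is_descriptive_title(title: str) -> bool:
    """Return True if the title follows the Feature - Scenario pattern."""
    return bool(TITLE_PATTERN.match(title))

print(is_descriptive_title("Login - Valid credentials redirect to dashboard"))  # True
print(is_descriptive_title("Test login"))  # False
```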
Use preconditions wisely
Preconditions describe the state of the system before testing begins — not the first step of the test. Use preconditions for things like "User must be logged in as admin" or "Feature flag X is enabled." If the setup involves actions the tester must perform, put those as test steps instead.
| Precondition (system state) | Test step (tester action) |
|---|---|
| "User account exists with role = Admin" | "Log in with admin credentials" |
| "Test database seeded with sample products" | "Navigate to the Products page" |
Keep steps atomic
Each step should describe one action and one expected result. Avoid combining multiple actions in a single step. Atomic steps make it easier to pinpoint exactly where a failure occurred and produce more meaningful execution reports.
| Too broad (multiple actions) | Atomic (one action each) |
|---|---|
| "Log in, navigate to settings, change the password and verify the success message" | Step 1: "Log in with valid credentials" / Step 2: "Navigate to Settings > Security" / Step 3: "Enter new password and confirm" / Step 4: "Verify success message is displayed" |
Leverage labels for cross-cutting categorization
Folders give you a primary organizational hierarchy, but labels add a secondary dimension. Use labels for things that cut across folders: sprint numbers (sprint-42), risk levels (high-risk), platforms (mobile, desktop), or compliance requirements (GDPR, SOC2).
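Conceptually, labels behave like a set attached to each test case, so a single label query can pull matching cases out of any folder. A small illustration (the case records here are invented for the example):

```python
# Each test case lives in one folder but can carry any number of labels.
cases = [
    {"id": "TC-201", "folder": "Auth",     "labels": {"sprint-42", "high-risk"}},
    {"id": "TC-202", "folder": "Checkout", "labels": {"sprint-42", "mobile"}},
    {"id": "TC-203", "folder": "Checkout", "labels": {"GDPR"}},
]

def with_label(cases, label):
    """Return the IDs of all cases carrying the given label, across folders."""
    return [c["id"] for c in cases if label in c["labels"]]

print(with_label(cases, "sprint-42"))  # ['TC-201', 'TC-202']
```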
Set Automation IDs for CI/CD integration
If you use automated testing, populate the Automation ID field with the identifier of the corresponding automated test (e.g., the test function name or test file path). This enables TestKase to automatically match CI/CD results to your test cases. See the Automation documentation for full setup instructions.
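Conceptually, the matching works like a dictionary lookup keyed on the Automation ID. The sketch below is illustrative only; the field names and pytest-style node IDs are assumptions for the example, not TestKase's actual API:

```python
# Test cases with their Automation IDs (here: hypothetical pytest node IDs).
test_cases = [
    {"id": "TC-101", "automation_id": "tests/test_auth.py::test_login_valid"},
    {"id": "TC-102", "automation_id": "tests/test_cart.py::test_remove_item"},
]

# CI results keyed by the automated test's identifier.
ci_results = {
    "tests/test_auth.py::test_login_valid": "passed",
    "tests/test_cart.py::test_remove_item": "failed",
}

def match_results(cases, results):
    """Map each test case ID to its CI outcome via the Automation ID."""
    return {c["id"]: results.get(c["automation_id"], "not run") for c in cases}

print(match_results(test_cases, ci_results))
# {'TC-101': 'passed', 'TC-102': 'failed'}
```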
Review and maintain test cases regularly
Test cases go stale as your product evolves. Schedule periodic reviews (e.g., once per sprint or release) to:
- Update steps that no longer match the current UI or API behavior.
- Deprecate test cases for features that have been removed.
- Add new test cases for recently shipped features.
- Verify that preconditions and expected results are still accurate.
- Consolidate duplicate or overlapping test cases.
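A periodic review can be seeded mechanically, for instance by flagging cases whose last update falls outside the review window. A sketch under stated assumptions (the updated field and the 90-day window are illustrative, not TestKase features):

```python
from datetime import date, timedelta

REVIEW_WINDOW = timedelta(days=90)  # illustrative cadence, e.g. one quarter

cases = [
    {"id": "TC-101", "updated": date(2024, 1, 10)},
    {"id": "TC-102", "updated": date(2024, 6, 1)},
]

def stale(cases, today, window=REVIEW_WINDOW):
    """Return IDs of cases not touched within the review window."""
    return [c["id"] for c in cases if today - c["updated"] > window]

print(stale(cases, date(2024, 7, 1)))  # ['TC-101']
```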
FAQ
Can I change the Test Case ID after creation?
No. The Test Case ID (e.g., TC-123) is auto-generated and permanent. It serves as a stable, immutable reference for the test case throughout its lifetime. You can always update the title and all other fields, but the ID remains fixed.
What happens to execution history when I edit a test case?
Existing execution history is preserved as-is. Execution records reflect the state of the test case at the time of execution. When you modify steps or fields, future executions will use the updated version, but past results remain unchanged. This gives you an accurate historical record even as the test case evolves.
How many test cases can I have in a project?
There is no hard limit on the number of test cases per project. TestKase is designed to handle test suites of any size — from a few dozen to tens of thousands of test cases. Performance optimizations like pagination, search indexing, and lazy loading ensure the interface remains responsive regardless of suite size. Storage limits for attachments depend on your plan.
Can I move a test case to a different project?
Currently, test cases cannot be moved between projects directly. However, you can export test cases from one project as CSV and import them into another project. The imported test cases will receive new IDs in the target project. Execution history, change history, and attachments are not transferred during CSV import/export.
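The export/import round trip looks roughly like the sketch below. The column layout is an invented example and TestKase's actual CSV schema may differ; the ID is deliberately left out of the exported columns, mirroring the fact that the target project assigns new IDs on import.

```python
import csv
import io

# Hypothetical column layout for the example; the real schema may differ.
FIELDS = ["title", "preconditions", "steps", "expected_results", "labels"]

rows = [
    {"title": "Login - Valid credentials redirect to dashboard",
     "preconditions": "User account exists",
     "steps": "Log in with valid credentials",
     "expected_results": "Dashboard is displayed",
     "labels": "high-risk"},
]

# "Export" from the source project into an in-memory CSV.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=FIELDS)
writer.writeheader()
writer.writerows(rows)

# "Import" into the target project: no ID column travels with the file,
# so the target project would assign a fresh TC-… identifier.
imported = list(csv.DictReader(io.StringIO(buf.getvalue())))
print(imported[0]["title"])
```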
What is the difference between Status (Draft/Active/Deprecated) and execution results (Passed/Failed)?
These are two different concepts. Status is a property of the test case itself: it indicates whether the test case is ready to be used (Active), still being written (Draft), or no longer relevant (Deprecated). Execution results (Passed, Failed, Blocked, Skipped, Not Run) are recorded when the test case is run inside a test cycle and reflect the outcome of that specific run.
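The distinction can be modeled as two separate enumerations. This is a rough sketch of the concept, not TestKase's internal schema:

```python
from enum import Enum

class Status(Enum):
    """Property of the test case itself."""
    DRAFT = "Draft"
    ACTIVE = "Active"
    DEPRECATED = "Deprecated"

class Result(Enum):
    """Outcome of one specific run inside a test cycle."""
    PASSED = "Passed"
    FAILED = "Failed"
    BLOCKED = "Blocked"
    SKIPPED = "Skipped"
    NOT_RUN = "Not Run"

# One Active test case accumulates many execution results over time;
# changing its Status never rewrites those past results.
case = {"id": "TC-123", "status": Status.ACTIVE,
        "results": [Result.FAILED, Result.PASSED, Result.PASSED]}
```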
How do I link a test case to a Jira issue?
If you have the Jira integration enabled, defects created from failed test case executions can be automatically synced to Jira as issues. You can also manually link Jira issue keys in the defect record. See the Jira Integration documentation for complete setup instructions.
Can I create test cases from an AI-generated suggestion?
Yes. TestKase includes AI features that can generate test case suggestions based on your requirements or existing test cases. AI-generated test cases are created as drafts so you can review and refine them before marking them as active. See the AI Features documentation for details.
How do I delete a test case that has execution history?
You can delete any test case regardless of its execution history. When you delete a test case, all associated data is permanently removed — including execution records, change history, defect links, and attachments. If you want to preserve the historical record but stop using the test case, consider setting its status to Deprecated instead of deleting it.