JSON document database
Tero is a JSON document database that provides ACID transactions, schema validation, and automated cloud backup and recovery.
- Atomicity: All-or-nothing transactions ensure data consistency
- Consistency: Schema validation and business rule enforcement
- Isolation: Concurrent operations are properly isolated
- Durability: Write-ahead logging ensures data survives system crashes
- High Performance: Intelligent caching and batch operations
- Data Integrity: Built-in corruption detection and recovery
- Schema Validation: Flexible schema system with strict mode
- Error Handling: Comprehensive error handling and recovery
- Memory Management: Efficient memory usage with automatic cleanup
- Cloud Backup: AWS S3 and Cloudflare R2 support
- Data Recovery: Automatic crash recovery and cloud restore
- Monitoring: Performance metrics and health checks
- Security: Path traversal protection and input validation
```sh
npm install tero
```

```js
import { Tero } from 'tero';

// Initialize database
const db = new Tero({
  directory: './mydata',
  cacheSize: 1000
});

// Basic operations
await db.create('user1', { name: 'Alice', email: 'alice@example.com' });
const user = await db.get('user1');
await db.update('user1', { age: 30 });
await db.remove('user1');
```

All basic operations are automatically wrapped in ACID transactions:
```js
// These operations are automatically ACID-compliant
await db.create('account', { balance: 1000 });
await db.update('account', { balance: 1500 });
```

For complex operations requiring multiple steps:
```js
const txId = db.beginTransaction();
try {
  await db.write(txId, 'account1', { balance: 900 });
  await db.write(txId, 'account2', { balance: 1100 });

  // Verify within the transaction
  const account1 = await db.read(txId, 'account1');

  await db.commit(txId);
} catch (error) {
  await db.rollback(txId);
  throw error;
}
```

The `transferMoney` helper demonstrates ACID properties combined with business logic:
```js
// Atomic money transfer with validation
await db.transferMoney('savings', 'checking', 500);
```

Define and enforce data schemas:
```js
// Set schema
db.setSchema('users', {
  name: { type: 'string', required: true, min: 2, max: 50 },
  email: { type: 'string', required: true, format: 'email' },
  age: { type: 'number', min: 0, max: 150 },
  profile: {
    type: 'object',
    properties: {
      bio: { type: 'string', max: 500 },
      website: { type: 'string', format: 'url' }
    }
  }
});

// Create with validation
await db.create('user1', userData, {
  validate: true,
  schemaName: 'users',
  strict: true
});
```

Efficient batch processing with ACID guarantees:
```js
// Batch write
await db.batchWrite([
  { key: 'product1', data: { name: 'Laptop', price: 999.99 } },
  { key: 'product2', data: { name: 'Mouse', price: 29.99 } },
  { key: 'product3', data: { name: 'Keyboard', price: 79.99 } }
]);

// Batch read
const products = await db.batchRead(['product1', 'product2', 'product3']);
```

Configure automatic cloud backups:
```js
db.configureBackup({
  format: 'archive',
  cloudStorage: {
    provider: 'aws-s3',
    region: 'us-east-1',
    bucket: 'my-backup-bucket',
    accessKeyId: process.env.AWS_ACCESS_KEY_ID,
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
  },
  retention: '30d'
});

// Perform backup
const result = await db.performBackup();
```

Automatic crash recovery and cloud restore:
```js
// Configure data recovery
db.configureDataRecovery({
  cloudStorage: cloudConfig,
  localPath: './mydata'
});

// Recover a specific file
await db.recoverFromCloud('important-data');

// Recover all files
const result = await db.recoverAllFromCloud();
```

Generate MongoDB ObjectId-like unique identifiers:
```js
// Generate unique IDs for different purposes
const userId = db.getNewId('user');       // e.g., "user-507f1f77bcf86cd799439011"
const sessionId = db.getNewId('session'); // e.g., "session-507f1f77bcf86cd799439012"
const orderId = db.getNewId('order');     // e.g., "order-507f1f77bcf86cd799439013"

// Use as document keys
await db.create(userId, {
  name: 'Alice',
  email: 'alice@example.com',
  createdAt: new Date()
});

// IDs are guaranteed unique across processes and time
const logId = db.getNewId('log');
await db.create(logId, { level: 'info', message: 'User created', userId: userId });
```

Built-in performance monitoring and health checks:
```js
// Cache performance (now using optimized QuickLRU)
const cacheStats = db.getCacheStats();
console.log(`Cache hit rate: ${cacheStats.hitRate}%`);
console.log(`Cache size: ${cacheStats.size}/${cacheStats.maxSize}`);

// Data integrity check
const integrity = await db.verifyDataIntegrity();
if (!integrity.healthy) {
  console.log(`Issues found: ${integrity.corruptedFiles.length} corrupted files`);
}

// Active transactions
const activeTx = db.getActiveTransactions();
console.log(`Active transactions: ${activeTx.length}`);
```

Comprehensive error handling with detailed messages:
```js
try {
  await db.create('user', invalidData, { validate: true, strict: true });
} catch (error) {
  if (error.message.includes('Schema validation failed')) {
    // Handle validation error
  } else if (error.message.includes('already exists')) {
    // Handle duplicate key error
  }
}
```

Configuration options:

```js
const db = new Tero({
  directory: './data', // Database directory
  cacheSize: 1000      // Maximum cache entries
});
```

Supported field types:

- `string`: Text data with length and format validation
- `number`: Numeric data with range validation
- `boolean`: True/false values
- `object`: Nested objects with property schemas
- `array`: Arrays with item type validation
- `date`: Date/time values
- `any`: Any data type (no validation)
Validation constraints:

- `required`: Field is mandatory
- `min`/`max`: Length/value constraints
- `format`: Built-in formats (email, url, uuid, etc.)
- `pattern`: Regular expression validation
- `enum`: List of allowed values
- `default`: Default value if not provided
- `custom`: Custom validation function
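To make the semantics of a few of these constraints concrete, here is a minimal, self-contained sketch of how `pattern`, `enum`, and `default` could be evaluated for a single field. This is illustrative only and does not reproduce Tero's actual validator.

```javascript
// Illustrative single-field validator (not Tero's implementation).
// Returns { ok, value } on success, or { ok: false, error } on failure.
function validateField(spec, value) {
  if (value === undefined) {
    // A missing value is filled from `default`, otherwise it fails if `required`.
    if ('default' in spec) return { ok: true, value: spec.default };
    return { ok: !spec.required, value };
  }
  if (spec.pattern && !new RegExp(spec.pattern).test(String(value))) {
    return { ok: false, error: `value does not match ${spec.pattern}` };
  }
  if (spec.enum && !spec.enum.includes(value)) {
    return { ok: false, error: `value not in ${spec.enum.join(', ')}` };
  }
  return { ok: true, value };
}

console.log(validateField({ pattern: '^[a-z]+$' }, 'alice'));               // passes
console.log(validateField({ enum: ['admin', 'user'] }, 'guest'));           // rejected
console.log(validateField({ default: 'user', required: true }, undefined)); // filled with default
```

The point of the sketch is the precedence: defaults apply only to missing values, and `pattern`/`enum` checks run only on values that are present.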
- Use batch operations for multiple documents
- Enable caching for frequently accessed data
- Use schema validation to catch errors early
- Monitor cache hit rates and adjust cache size
- Use transactions for related operations
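As an example of the last two tips, a hypothetical tuning heuristic might grow the cache while the hit rate stays below a target. The shape of `stats` mirrors `getCacheStats()`; the target, growth factor, and cap below are illustrative assumptions, not recommendations from Tero.

```javascript
// Hypothetical heuristic: grow the cache while the hit rate is poor,
// capped to avoid unbounded memory use.
function suggestCacheSize(stats, { targetHitRate = 80, growthFactor = 2, maxEntries = 100000 } = {}) {
  if (stats.hitRate >= targetHitRate) return stats.maxSize; // hit rate is fine, keep current size
  return Math.min(stats.maxSize * growthFactor, maxEntries);
}

console.log(suggestCacheSize({ hitRate: 55, size: 1000, maxSize: 1000 })); // 2000
console.log(suggestCacheSize({ hitRate: 92, size: 800, maxSize: 1000 }));  // 1000
```

A new size suggested this way would be applied by constructing the database with a larger `cacheSize`.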
- Path Traversal Protection: Automatic key sanitization
- Input Validation: Comprehensive data validation
- Error Handling: No sensitive data in error messages
- Access Control: File system permissions respected
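To illustrate what key sanitization guards against, a minimal sanitizer might look like the following. This is a sketch of the idea, not Tero's actual sanitization rules.

```javascript
// Illustrative key sanitizer: strips parent-directory segments and path
// separators so a key can never escape the database directory.
// (Not Tero's actual implementation.)
function sanitizeKey(key) {
  return String(key)
    .replace(/\.\./g, '')              // drop parent-directory segments
    .replace(/[\/\\]/g, '')            // drop path separators
    .replace(/[^a-zA-Z0-9._-]/g, '_'); // normalize anything else
}

console.log(sanitizeKey('../../etc/passwd')); // "etcpasswd"
console.log(sanitizeKey('user-1'));           // "user-1"
```

Whatever the exact rules, the invariant is the same: a document key maps to a file strictly inside the configured directory.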
Core operations:

- `create(key, data, options?)`: Create a new document
- `get(key)`: Read a document
- `update(key, data, options?)`: Update a document
- `remove(key)`: Delete a document
- `exists(key)`: Check whether a document exists
Transactions:

- `beginTransaction()`: Start a new transaction
- `write(txId, key, data, options?)`: Write within a transaction
- `read(txId, key)`: Read within a transaction
- `delete(txId, key)`: Delete within a transaction
- `commit(txId)`: Commit a transaction
- `rollback(txId)`: Roll back a transaction
Batch operations:

- `batchWrite(operations, options?)`: Batch write operations
- `batchRead(keys)`: Batch read operations
Schema management:

- `setSchema(name, schema)`: Define a schema
- `getSchema(name)`: Get a schema definition
- `removeSchema(name)`: Remove a schema
- `validateData(name, data)`: Validate data against a schema
Utilities:

- `getCacheStats()`: Cache performance metrics
- `verifyDataIntegrity()`: Check data health
- `getActiveTransactions()`: List active transactions
- `forceCheckpoint()`: Force a WAL flush
- `clearCache()`: Clear the in-memory cache
- `getNewId(prefix)`: Generate a unique identifier with a custom prefix
- `destroy()`: Clean up and shut down
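Tero's durability guarantee rests on write-ahead logging, and `forceCheckpoint()` flushes that log. The general mechanism can be sketched with an in-memory illustration; this shows the idea only, not Tero's on-disk format.

```javascript
// In-memory sketch of write-ahead logging: every mutation is appended to the
// log before it is applied to the store, so replaying the log after a crash
// reconstructs all logged state.
class WalSketch {
  constructor() {
    this.log = [];
    this.store = new Map();
  }

  write(key, data) {
    this.log.push({ op: 'write', key, data }); // 1. append to the log first
    this.store.set(key, data);                 // 2. then apply to the store
  }

  // After a crash, rebuild the store purely from the log.
  static recover(log) {
    const store = new Map();
    for (const entry of log) {
      if (entry.op === 'write') store.set(entry.key, entry.data);
    }
    return store;
  }
}

const wal = new WalSketch();
wal.write('user1', { name: 'Alice' });
wal.write('user1', { name: 'Alice', age: 30 });
const recovered = WalSketch.recover(wal.log); // survives loss of the in-memory store
console.log(recovered.get('user1'));
```

In a real database the log lives on disk and a checkpoint applies logged entries to the data files so the log can be truncated; that is the role `forceCheckpoint()` plays here.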
Run the production test suite:
```sh
npm run test:production
```

MIT License - see the LICENSE file for details.
- Fork the repository
- Create a feature branch
- Add tests for new functionality
- Ensure all tests pass
- Submit a pull request
For issues and questions:
- GitHub Issues: Report bugs and request features
- Documentation: Full API documentation
Tero - Production-ready ACID JSON database for modern applications.