DevstackBackend
A comprehensive, modular Express/Mongoose backend package with built-in authentication, file uploads, rate limiting, caching, email services, push notifications, payments, and logging. Perfect for rapid backend development with production-ready features.
Features
✨ 11+ Production-Ready Services:
- 🔐 Authentication (Signup/Signin with JWT)
- 👤 User Profile Management
- 📧 Email OTP Service (with validation & rate limiting)
- 📧 General Email Service (transactional emails, bulk, templates)
- 📁 File Upload Service (single/multiple files with validation)
- 🚦 Rate Limiting Service (prevent API abuse)
- 💾 Cache Service (in-memory caching with TTL)
- 🗄️ Database CRUD Service (dynamic model creation & RESTful APIs)
- 🔔 Push Notifications (Web Push API)
- 💳 Payment Integration (Stripe & Razorpay)
- 📝 Logging Service (request/error logging with file support)
Install
npm install devstackbackend
Environment Variables
Create a .env file with the following variables:
# Database
MONGO_URI=mongodb://localhost:27017/myapp
# Authentication
JWT_SECRET=your-super-secret-jwt-key-change-in-production
# Email (for OTP and Email services)
EMAIL_USER=your-email@gmail.com
EMAIL_PASS=your-app-password
# Push Notifications (Web Push)
VAPID_PUBLIC_KEY=your-vapid-public-key
VAPID_PRIVATE_KEY=your-vapid-private-key
# Payments - Stripe
STRIPE_SECRET_KEY=sk_test_your-stripe-secret-key
STRIPE_PUBLIC_KEY=pk_test_your-stripe-public-key
# Payments - Razorpay
RAZORPAY_KEY_ID=your-razorpay-key-id
RAZORPAY_KEY_SECRET=your-razorpay-key-secret
Note: For Gmail, you'll need to use an App Password instead of your regular password.
Generate VAPID keys:
npx web-push generate-vapid-keys
Services Documentation
1. Server Setup
Description
Initializes Express app with CORS, JSON parsing, and MongoDB connection.
The Server Setup service provides the foundational infrastructure for your backend application. It abstracts away the boilerplate code required to initialize an Express server with essential middleware and database connectivity. Instead of manually configuring CORS, body parsers, and MongoDB connections every time you start a project, this service handles all the setup automatically, allowing you to focus on building your application logic. It establishes a clean, production-ready foundation with cross-origin resource sharing enabled, JSON request parsing configured, and a reliable database connection established.
How it Works
import { startServer } from "devstackbackend";
// Initialize server
const { app } = await startServer({
mongoUri: process.env.MONGO_URI, // Required
port: Number(process.env.PORT) || 3000 // Optional, default: 3000
});
Returns: { app, server } - Express app and HTTP server instances.
2. Authentication
Description
User signup and signin with JWT token generation.
Authentication is the process of verifying user identity, and this service implements a secure, industry-standard authentication flow. When users sign up, their credentials are validated, passwords are securely hashed using bcrypt (a one-way encryption algorithm), and stored safely in your database. During sign-in, the service verifies credentials without ever exposing plaintext passwords. Upon successful authentication, it issues a JSON Web Token (JWT) - a compact, URL-safe token that contains encoded user information. This token acts as a secure credential that clients can use to authenticate subsequent requests without repeatedly sending passwords. JWTs are stateless, meaning your server doesn't need to maintain session storage, making your application more scalable and resilient.
Features:
- ✅ Zod validation
- ✅ Bcrypt password hashing
- ✅ JWT tokens (1-day expiry)
- ✅ Email uniqueness check
How it Works
import { startSignup, startSignin } from "devstackbackend";
// Authentication
startSignup(app, process.env.JWT_SECRET);
startSignin(app, process.env.JWT_SECRET);
Endpoints:
POST /signup
- Body: { name: string, email: string, password: string }
- Response: { token: string, user: { id, name, email } }
POST /signin
- Body: { email: string, password: string }
- Response: { token: string, user: { id, name, email } }
3. Profile Service
Description
JWT-protected user profile management.
User profiles are central to most applications - they store personal information, preferences, and account settings. The Profile Service provides a secure, self-service approach to profile management where users can view and update their own information. It implements a token-based authentication mechanism that ensures users can only access and modify their own profiles, never other users' data. The service validates incoming requests using JWT tokens (issued during authentication), extracts the user's identity, and performs operations scoped to that specific user. When updating passwords, it automatically re-hashes the new password before storage, maintaining the same security standards as the initial signup process. This creates a clean separation of concerns: authentication issues tokens, and profile management consumes them.
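The first step of that validation is pulling the token out of the Authorization header before verifying it. A minimal sketch of that step (hypothetical helper, not the package's actual middleware):

```javascript
// Hypothetical sketch of Bearer-token extraction; the real middleware
// may differ. Returns the raw token string, or null if the header is
// missing or malformed.
function extractBearerToken(authHeader) {
  if (typeof authHeader !== "string") return null;
  const [scheme, token] = authHeader.split(" ");
  if (scheme !== "Bearer" || !token) return null;
  return token; // would then be passed to jwt.verify(token, jwtSecret)
}
```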
Features:
- ✅ JWT authentication middleware
- ✅ Secure password updates with hashing
- ✅ Profile data retrieval
How it Works
import { ProfileService } from "devstackbackend";
ProfileService(app, { jwtSecret: process.env.JWT_SECRET });
Endpoints:
GET /me (requires Authorization: Bearer <token>)
- Response: { id, name, email }
PUT /me (requires Authorization: Bearer <token>)
- Body: { name?: string, password?: string }
- Response: { id, name, email }
4. Email OTP Service
Description
Send and verify one-time passwords via email with rate limiting and validation.
One-Time Passwords (OTPs) are temporary, single-use codes sent via email to verify user identity or email ownership. This service implements a complete OTP workflow: generating random numeric codes, delivering them via email, storing them securely with expiration times, and validating them when users submit them back. The OTP acts as a second factor of authentication, adding an extra security layer beyond passwords. The service includes built-in protections against abuse: rate limiting prevents users from flooding the system with requests, expiration ensures codes can't be used indefinitely, and validation ensures only correct codes are accepted. Once verified, OTPs are immediately invalidated to prevent reuse. This is commonly used for email verification during signup, password reset flows, or two-factor authentication (2FA).
Features:
- ✅ Email validation with Zod
- ✅ Configurable OTP length (4-10 digits)
- ✅ Resend cooldown to prevent abuse
- ✅ Automatic cleanup of expired OTPs
- ✅ Beautiful HTML email templates
- ✅ Programmatic API methods
How it Works
import { EmailOtpService } from "devstackbackend";
// Email Services
EmailOtpService(app, {
emailUser: process.env.EMAIL_USER,
emailPass: process.env.EMAIL_PASS,
otpExpiry: 5, // 5 minutes
resendCooldown: 60, // 60 seconds between resends
});
Endpoints:
POST /send-otp
- Body: { email: string }
- Response: { success: true, message: string, expiresIn: number }
POST /verify-otp
- Body: { email: string, otp: string }
- Response: { verified: true, message: string }
Programmatic Usage:
const otpService = EmailOtpService(app, config);
// Send OTP
await otpService.sendOTP("user@example.com");
// Verify OTP
const isValid = otpService.verifyOTP("user@example.com", "123456");
// Check if OTP exists
const hasOTP = otpService.hasOTP("user@example.com");
5. Email Service
Description
General-purpose email sending service for transactional emails, bulk emails, and templates.
Email communication is essential for modern applications - whether sending welcome messages, password reset links, order confirmations, newsletters, or system notifications. The Email Service provides a unified, flexible interface for all email-related operations in your application. It abstracts away the complexities of SMTP configuration and email formatting, allowing you to send emails programmatically through simple API calls. The service supports multiple email patterns: single emails for transactional messages (like order confirmations), bulk emails for marketing campaigns or notifications, and template-based emails that allow dynamic content insertion (like personalizing emails with user names or order details). By centralizing email functionality, you maintain consistent email formatting, handle errors uniformly, and can easily switch email providers without changing your application code.
Features:
- ✅ Single and bulk email sending
- ✅ HTML and plain text support
- ✅ File attachments
- ✅ Template-based emails with variable substitution
- ✅ CC and BCC support
- ✅ Programmatic API
How it Works
import { EmailService } from "devstackbackend";
EmailService(app, {
emailUser: process.env.EMAIL_USER,
emailPass: process.env.EMAIL_PASS,
service: "gmail",
fromName: "My App",
});
Endpoints:
POST /api/email/send
- Body: { to: string | string[], subject: string, text?: string, html?: string, cc?: string, bcc?: string, attachments?: Array }
- Response: { success: true, messageId: string }
POST /api/email/send-bulk
- Body: { emails: string[] | Array<{to, subject, text, html}>, subject: string, text?: string, html?: string }
- Response: { success: true, total: number, successful: number, failed: number }
POST /api/email/send-template
- Body: { to: string, subject: string, template: string, variables: object }
- Response: { success: true, messageId: string }
- Templates use {{variableName}} syntax for variable substitution
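The {{variableName}} substitution can be sketched with a small helper (hypothetical, not the package's actual implementation):

```javascript
// Hypothetical sketch of {{variable}} substitution; the package's real
// implementation may differ. Unknown placeholders are replaced with "".
function renderTemplate(template, variables) {
  return template.replace(/\{\{(\w+)\}\}/g, (_, key) =>
    variables[key] !== undefined ? String(variables[key]) : ""
  );
}
```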
Example:
// Template email
POST /api/email/send-template
{
"to": "user@example.com",
"subject": "Welcome!",
"template": "<h1>Hello {{name}}!</h1><p>Welcome to {{appName}}</p>",
"variables": { "name": "John", "appName": "MyApp" }
}
6. File Upload Service
Description
Handle file uploads with validation, size limits, and storage management.
File uploads are a common requirement in web applications - users upload profile pictures, documents, images, videos, or other media files. The File Upload Service handles the complex process of receiving files from clients, validating them (checking file types, sizes, formats), securely storing them on the server, and providing access URLs. It protects your system by enforcing file size limits (preventing denial-of-service attacks), restricting file types (preventing malicious uploads), and sanitizing filenames (avoiding path traversal vulnerabilities). The service automatically generates unique filenames to prevent conflicts when multiple users upload files with the same name, and provides organized storage structure. It also offers endpoints to serve uploaded files back to clients, though in production, you'd typically use a CDN or cloud storage service for better performance and scalability.
Features:
- ✅ Automatic directory creation
- ✅ File size validation
- ✅ MIME type filtering
- ✅ File extension filtering
- ✅ Unique filename generation
- ✅ File serving endpoint
- ✅ Helper methods for programmatic use
How it Works
import { FileUploadService } from "devstackbackend";
// File Uploads
FileUploadService(app, {
uploadDir: "./uploads",
maxFileSize: 5 * 1024 * 1024, // 5MB
allowedExtensions: [".jpg", ".png", ".pdf", ".doc", ".docx"],
});
Endpoints:
POST /api/upload (single file, field name: file)
- Content-Type: multipart/form-data
- Response: { success: true, file: { filename, originalName, mimetype, size, path, url } }
POST /api/upload/multiple (multiple files, field name: files, max 10)
- Content-Type: multipart/form-data
- Response: { success: true, count: number, files: Array }
GET /uploads/:filename (serve uploaded files)
Programmatic Usage:
const uploadService = FileUploadService(app, config);
// Use middleware in custom routes
app.post("/custom-upload", uploadService.getUploadMiddleware("avatar", 1), (req, res) => {
res.json({ file: req.file });
});
// Delete file
uploadService.deleteFile("filename.jpg");
// Get upload directory
const dir = uploadService.getUploadDir();
7. Rate Limiting Service
Description
Protect your API from abuse with configurable rate limits.
Rate limiting is a critical security and performance mechanism that prevents API abuse by restricting the number of requests a client can make within a specified time window. Without rate limiting, malicious actors could overwhelm your server with thousands of requests, causing service degradation, consuming resources, or even causing a denial-of-service (DoS) attack. Legitimate users might also inadvertently make too many requests, impacting other users' experience. This service implements a token bucket algorithm that tracks requests per client (identified by IP address or custom keys) and enforces limits. When a client exceeds their limit, the service returns a 429 "Too Many Requests" status, protecting your backend resources. The service is flexible - you can set different limits for different routes, skip rate limiting for certain paths (like health checks), or use custom key generators (e.g., rate limit by user ID instead of IP for authenticated endpoints).
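The token bucket idea can be sketched in a few lines (simplified sketch with an injectable clock; the service's internals may differ):

```javascript
// Simplified token-bucket sketch (illustrative only; not the service's code).
// Each client gets `capacity` tokens that refill at `refillPerMs` tokens per ms.
function createBucket(capacity, refillPerMs, now = Date.now) {
  let tokens = capacity;
  let last = now();
  return function allow() {
    const t = now();
    tokens = Math.min(capacity, tokens + (t - last) * refillPerMs); // refill
    last = t;
    if (tokens >= 1) { tokens -= 1; return true; } // request allowed
    return false;                                  // would answer 429 Too Many Requests
  };
}
```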
Features:
- ✅ IP-based or custom key generation
- ✅ Configurable time windows
- ✅ Automatic cleanup of expired entries
- ✅ Rate limit headers in responses
- ✅ Skip conditions for specific routes
- ✅ Custom rate limiters per route
How it Works
import { RateLimitService } from "devstackbackend";
// Enable rate limiting (recommended)
RateLimitService(app, {
windowMs: 15 * 60 * 1000, // 15 minutes
maxRequests: 100, // Limit each IP to 100 requests per windowMs
});
Endpoints:
GET /api/rate-limit/status
- Response: { remaining: number, limit: number, resetTime: string }
Response Headers:
- X-RateLimit-Limit: Maximum requests allowed
- X-RateLimit-Remaining: Remaining requests
- X-RateLimit-Reset: Reset time (ISO string)
Custom Rate Limiter:
const rateLimitService = RateLimitService(app, config);
// Apply stricter limits to specific route
app.post("/api/sensitive", rateLimitService.createLimiter(60000, 5), (req, res) => {
// Max 5 requests per minute
res.json({ message: "Success" });
});
8. Cache Service
Description
In-memory caching with TTL (Time-To-Live) and LRU eviction.
Caching is a performance optimization technique that stores frequently accessed data in fast, temporary storage (memory) to avoid expensive operations like database queries or complex calculations. When you cache a response, subsequent requests for the same data can be served instantly from memory instead of re-computing or re-fetching from slower storage systems. This dramatically improves response times and reduces load on your database. The Cache Service implements an intelligent caching system with Time-To-Live (TTL) - cached data automatically expires after a specified duration, ensuring users don't receive stale data. When the cache fills up, it uses LRU (Least Recently Used) eviction - removing the least recently accessed entries to make room for new ones. The service provides middleware that can automatically cache route responses, so you can cache expensive operations (like complex database queries or API calls) with a single line of code.
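The TTL-plus-LRU combination can be sketched with a plain Map, which preserves insertion order so the first key is the least recently used (illustrative sketch with an injectable clock, not the service's internals):

```javascript
// Tiny LRU + TTL cache sketch (illustrative only; not the service's code).
function createCache(maxSize, defaultTTL, now = Date.now) {
  const store = new Map(); // key -> { value, expires }; first key = LRU
  return {
    set(key, value, ttl = defaultTTL) {
      if (store.has(key)) store.delete(key);              // refresh recency
      else if (store.size >= maxSize)
        store.delete(store.keys().next().value);          // evict LRU entry
      store.set(key, { value, expires: now() + ttl });
    },
    get(key) {
      const entry = store.get(key);
      if (!entry) return undefined;
      if (entry.expires <= now()) { store.delete(key); return undefined; } // expired
      store.delete(key); store.set(key, entry);           // mark as recently used
      return entry.value;
    },
  };
}
```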
Features:
- ✅ TTL (Time-To-Live) support
- ✅ LRU (Least Recently Used) eviction
- ✅ Automatic cleanup of expired entries
- ✅ Cache middleware for routes
- ✅ Programmatic cache control
How it Works
import { CacheService } from "devstackbackend";
// Caching
CacheService(app, {
defaultTTL: 3600000, // 1 hour
maxSize: 1000, // Max 1000 entries
});
Endpoints:
GET /api/cache/stats
- Response: { size: number, validEntries: number, expiredEntries: number, maxSize: number, defaultTTL: number }
DELETE /api/cache/clear
- Response: { message: string, entriesRemoved: number }
DELETE /api/cache/:key
- Response: { message: string, deleted: boolean }
Usage:
const cacheService = CacheService(app, config);
// Cache middleware (cache for 5 minutes)
app.get("/api/data", cacheService.middleware(300000), async (req, res) => {
const data = await fetchData(); // Expensive operation
res.json(data); // Automatically cached
});
// Manual cache control
cacheService.set("key", { data: "value" }, 60000); // Cache for 1 minute
const value = cacheService.get("key");
cacheService.delete("key");
cacheService.clear(); // Clear all
9. Push Notification Service
Description
Browser push notifications using Web Push API with VAPID authentication.
Push notifications enable real-time communication with users even when they're not actively using your application. Unlike traditional request-response patterns where the client must initiate communication, push notifications allow servers to proactively send messages to users' devices. This is powered by the Web Push Protocol, a standardized way for web applications to receive messages pushed from a server. The service uses VAPID (Voluntary Application Server Identification) keys to authenticate your server with browser push services, ensuring only authorized servers can send notifications to users. Users subscribe to notifications by granting permission through their browser, and your application stores these subscriptions. When you need to notify users (e.g., new message received, order shipped, system alert), you send notifications to all active subscriptions. This creates a direct communication channel that works across different browsers and platforms, enhancing user engagement and enabling real-time experiences.
Features:
- ✅ VAPID authentication
- ✅ Broadcast to all subscribers
- ✅ Programmatic API methods
How it Works
import { PushNotificationService } from "devstackbackend";
// Push Notifications
PushNotificationService({
app,
publicKey: process.env.VAPID_PUBLIC_KEY,
privateKey: process.env.VAPID_PRIVATE_KEY,
email: "mailto:admin@example.com",
});
Generate VAPID keys:
npx web-push generate-vapid-keys
Endpoints:
POST /subscribe
- Body: PushSubscription object (from browser)
- Response: { message: "Subscription added" }
POST /send
- Body: { title?: string, body?: string }
- Response: { message: string, results: Array }
Frontend Integration:
// In your frontend code
const registration = await navigator.serviceWorker.ready;
const subscription = await registration.pushManager.subscribe({
userVisibleOnly: true,
applicationServerKey: VAPID_PUBLIC_KEY // from your backend
});
// Send subscription to backend
await fetch("/subscribe", {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify(subscription)
});
Programmatic Usage:
const pushService = PushNotificationService(config);
// Send notification manually
await pushService.sendAll({ title: "Hello", body: "World" });
// Add subscription manually
pushService.addSubscription(subscription);
10. Payment Service
Description
Unified payment integration supporting both Stripe and Razorpay.
Payment processing is complex - it involves securely handling sensitive financial information, complying with PCI-DSS regulations, supporting multiple payment methods, and integrating with payment gateways. The Payment Service abstracts this complexity by providing a unified interface to multiple payment providers (Stripe and Razorpay). Instead of implementing separate code for each provider, you use one consistent API. When processing a payment, the service initiates payment on the server side (where credentials are secure), generates payment tokens or secrets, and returns them to the client. The client then uses these tokens with the payment provider's SDK to complete the transaction securely - this ensures sensitive card information never touches your server, reducing your PCI compliance burden. The service also provides payment verification endpoints to confirm successful transactions, preventing fraudulent claims and ensuring payment integrity. This unified approach allows you to support multiple payment methods and switch providers without changing your application code.
Features:
- ✅ Support for both Stripe and Razorpay
- ✅ Payment verification (Razorpay)
- ✅ Payment status checking (Stripe)
- ✅ Server-side payment initiation
How it Works
import { PaymentService } from "devstackbackend";
// Payments
PaymentService(app, {
stripeSecretKey: process.env.STRIPE_SECRET_KEY,
stripePublicKey: process.env.STRIPE_PUBLIC_KEY,
razorpayKeyId: process.env.RAZORPAY_KEY_ID,
razorpayKeySecret: process.env.RAZORPAY_KEY_SECRET,
});
Endpoints:
POST /api/payment/create
- Body: { amount: number, currency?: string, gateway: "stripe" | "razorpay" }
- Stripe Response: { provider: "stripe", clientSecret: string, publicKey: string }
- Razorpay Response: { provider: "razorpay", orderId: string, amount: number, currency: string, keyId: string }
POST /api/payment/verify-razorpay (Razorpay only)
- Body: { orderId: string, paymentId: string, signature: string }
- Response: { verified: boolean, message: string }
GET /api/payment/status/:paymentIntentId (Stripe only)
- Response: { id: string, status: string, amount: number, currency: string }
Stripe Frontend Integration:
const { clientSecret, publicKey } = await fetch("/api/payment/create", {
method: "POST",
body: JSON.stringify({ amount: 1000, currency: "USD", gateway: "stripe" })
}).then(r => r.json());
// Use Stripe.js to complete payment
Razorpay Frontend Integration:
const { orderId, keyId } = await fetch("/api/payment/create", {
method: "POST",
body: JSON.stringify({ amount: 1000, currency: "INR", gateway: "razorpay" })
}).then(r => r.json());
// Use Razorpay Checkout
11. Logging Service
Description
Comprehensive request and error logging with optional file output.
Logging is essential for understanding what's happening in your application - it helps debug issues, monitor performance, track user behavior, audit security events, and maintain system health. Without proper logging, diagnosing production issues becomes nearly impossible. The Logging Service automatically captures and records every HTTP request and response, including timing information, status codes, IP addresses, and error details. It intercepts requests and responses transparently, so you don't need to manually add logging code to every route. The service provides structured logging that can be written to console (for development) or files (for production), with configurable formats and filtering capabilities. You can exclude certain paths from logging (like health checks or static assets) to reduce noise. Error logging captures stack traces and context, making debugging production issues much easier. Optional file logging creates daily log files that can be rotated, archived, or analyzed using log aggregation tools like ELK stack or Splunk.
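Conceptually, each log line is just a timestamp, a level, a message, and serialized metadata. A hypothetical formatter producing that shape (illustrative only):

```javascript
// Hypothetical log-line formatter (illustrative only; not the service's code).
// Produces: [ISO timestamp] [LEVEL] message {"meta":"..."}
function formatLogLine(level, message, meta, date = new Date()) {
  return `[${date.toISOString()}] [${level.toUpperCase()}] ${message} ${JSON.stringify(meta)}`;
}
```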
Features:
- ✅ HTTP request/response logging
- ✅ Error logging with stack traces
- ✅ File-based logging (optional)
- ✅ Request/response time tracking
- ✅ Path-based filtering
- ✅ Custom log formatters
- ✅ Programmatic logging methods
How it Works
import { LoggingService } from "devstackbackend";
// Enable logging (recommended first)
LoggingService(app, {
logRequests: true,
logErrors: true,
logToFile: false, // Set to true for file logging
});
Endpoints:
GET /api/logs/stats
- Response: { logRequests: boolean, logErrors: boolean, logToFile: boolean, logFilePath: string | null, skipPaths: string[] }
Log Format:
[2024-01-01T12:00:00.000Z] [INFO] GET /api/users 200 {"method":"GET","url":"/api/users","statusCode":200,"duration":"45ms","ip":"127.0.0.1"}
Programmatic Usage:
const logger = LoggingService(app, config);
// Log messages programmatically
logger.info("User created", { userId: 123 });
logger.warn("Rate limit approaching", { ip: "127.0.0.1" });
logger.error("Database error", { error: err.message });
logger.debug("Debug info", { data: someData });
// Close log stream (when shutting down)
logger.close();
12. Database CRUD Service
Description
Dynamic model creation and automatic RESTful CRUD endpoints with filtering, sorting, and pagination.
Most applications follow similar patterns for data management: creating records, reading them, updating them, and deleting them (CRUD operations). Traditionally, you'd write repetitive code for each model - defining routes, validation, error handling, pagination, filtering, and sorting. The Database CRUD Service eliminates this boilerplate by automatically generating complete RESTful APIs from simple schema definitions. You define your data structure once (e.g., "Product has name, price, category"), and the service automatically creates all necessary endpoints with industry-standard features. It implements best practices like pagination (preventing large result sets from overwhelming clients), filtering (allowing queries like "all products under $100"), sorting (ordering results by price or date), and field selection (returning only requested data). The service also provides bulk operations for efficiently handling multiple records at once. This approach dramatically accelerates development - instead of writing hundreds of lines of CRUD code, you define a schema and get a complete, production-ready API automatically. It's particularly powerful for prototyping, admin panels, or building APIs where you need quick, standard CRUD operations.
Features:
- ✅ Automatic RESTful endpoint generation
- ✅ Dynamic model creation from schema definitions
- ✅ Pagination, filtering, sorting, field selection
- ✅ Query parameter parsing with range operators
- ✅ Bulk operations (create, update, delete)
- ✅ Automatic validation using Mongoose
- ✅ Timestamps (createdAt, updatedAt) automatically added
- ✅ Optional authentication middleware per model
- ✅ Method-level access control
- ✅ Custom middleware support
- ✅ Programmatic model access
How it Works
import { DatabaseService } from "devstackbackend";
// Database CRUD Operations
const dbService = DatabaseService(app, {
basePath: "/api",
enableAutoId: true,
// authMiddleware: yourAuthMiddleware, // Optional
});
// Register a model with automatic CRUD endpoints
dbService.registerModel("Product", {
name: { type: String, required: true },
price: { type: Number, required: true },
description: String,
category: String,
inStock: { type: Boolean, default: true },
}, {
path: "products", // Custom path (default: model name lowercase)
requireAuth: false, // Set to true to require authentication
allowedMethods: ["GET", "POST", "PUT", "DELETE"], // Control which methods are available
});
Endpoints (automatically generated):
GET /api/products - List all (with pagination, filtering, sorting)
GET /api/products/:id - Get one by ID
POST /api/products - Create new
PUT /api/products/:id - Update by ID
DELETE /api/products/:id - Delete by ID
POST /api/bulk/:modelName - Bulk operations (create/update/delete)
GET /api/models - List all registered models
Query Parameters (for GET /api/products):
- page - Page number (default: 1)
- limit - Items per page (default: 10)
- sort - Sort fields (e.g., -createdAt,price for newest first, then by price)
- fields - Select specific fields (e.g., name,price)
- category - Filter by exact match
- price_gte - Filter: price >= value
- price_lte - Filter: price <= value
- price_min - Filter: price >= value (alias for price_gte)
- price_max - Filter: price <= value (alias for price_lte)
- name_like - Text search (case-insensitive regex)
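A sketch of how such range operators might translate into a Mongoose-style filter (hypothetical helper, not the package's actual parser):

```javascript
// Hypothetical query-string parser for range/search operators (illustrative only).
// Maps e.g. { price_gte: "100", category: "x" } to { price: { $gte: 100 }, category: "x" }.
const OPS = { gte: "$gte", lte: "$lte", min: "$gte", max: "$lte" };

function buildFilter(query) {
  const filter = {};
  for (const [key, raw] of Object.entries(query)) {
    const m = key.match(/^(.+)_(gte|lte|min|max|like)$/);
    if (!m) { filter[key] = raw; continue; }            // plain key: exact match
    const [, field, op] = m;
    if (op === "like") {
      filter[field] = { $regex: raw, $options: "i" };   // case-insensitive search
    } else {
      filter[field] = { ...filter[field], [OPS[op]]: Number(raw) }; // range bound
    }
  }
  return filter;
}
```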
Examples:
// GET /api/products?page=1&limit=20&sort=-createdAt&category=electronics
// GET /api/products?price_min=100&price_max=500&name_like=laptop
// Create a product
POST /api/products
{
"name": "Laptop",
"price": 999.99,
"category": "electronics",
"description": "High-performance laptop",
"tags": ["gaming", "portable"]
}
// Update a product
PUT /api/products/:id
{
"price": 899.99,
"inStock": false
}
// Bulk create
POST /api/bulk/Product
{
"operation": "create",
"data": [
{ "name": "Product 1", "price": 100 },
{ "name": "Product 2", "price": 200 }
]
}
// Bulk update
POST /api/bulk/Product
{
"operation": "update",
"filter": { "category": "electronics" },
"data": { "inStock": false }
}
// Bulk delete
POST /api/bulk/Product
{
"operation": "delete",
"filter": { "inStock": false }
}
Response Format:
{
"success": true,
"data": { /* document or array */ },
"pagination": {
"page": 1,
"limit": 10,
"total": 100,
"pages": 10
},
"message": "Operation successful"
}
Programmatic Usage:
// Get model programmatically
const Product = dbService.getModel("Product");
// Use model directly
const products = await Product.find({ category: "electronics" });
// Get all registered models
const models = dbService.getRegisteredModels(); // ["Product", "Order", ...]
Data Model
User Schema
{
email: String (unique, required),
password: String (hashed with bcrypt),
name: String (required)
}
Best Practices
Production Checklist
Environment Variables
- ✅ Use a strong, randomly generated JWT_SECRET
- ✅ Never commit .env files
- ✅ Use environment-specific configurations
Security
- ✅ Use HTTPS in production (required for push notifications and payments)
- ✅ Implement rate limiting on all public endpoints
- ✅ Validate and sanitize all user inputs
- ✅ Use secure password policies
Storage
- ✅ Store OTPs in database for production (not in-memory)
- ✅ Store push subscriptions in database for persistence
- ✅ Use Redis for distributed caching and rate limiting
- ✅ Store uploaded files in cloud storage (S3, Cloudinary, etc.)
Monitoring
- ✅ Enable logging service with file output
- ✅ Monitor rate limit violations
- ✅ Set up error tracking (Sentry, etc.)
- ✅ Monitor payment failures
Performance
- ✅ Use caching for expensive operations
- ✅ Implement database indexing
- ✅ Use CDN for static file serving
- ✅ Enable compression middleware
Error Handling
All services follow consistent error response format:
{
"error": "Error type",
"message": "Human-readable error message",
// ... additional fields
}
Common HTTP Status Codes:
- 200 - Success
- 201 - Created
- 400 - Bad Request (validation errors)
- 401 - Unauthorized (missing/invalid token)
- 404 - Not Found
- 429 - Too Many Requests (rate limit exceeded)
- 500 - Internal Server Error
Examples
Complete Authentication Flow
// 1. Sign up
const signupRes = await fetch("/signup", {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify({ name: "John", email: "john@example.com", password: "secure123" })
});
const { token } = await signupRes.json();
// 2. Get profile
const profileRes = await fetch("/me", {
headers: { "Authorization": `Bearer ${token}` }
});
const profile = await profileRes.json();
// 3. Update profile
await fetch("/me", {
method: "PUT",
headers: {
"Authorization": `Bearer ${token}`,
"Content-Type": "application/json"
},
body: JSON.stringify({ name: "John Doe" })
});
File Upload Example
const formData = new FormData();
formData.append("file", fileInput.files[0]);
const res = await fetch("/api/upload", {
method: "POST",
body: formData
});
const { file } = await res.json();
console.log(file.url); // Use this URL to display the file
Caching Expensive Operations
// Cache expensive database query for 5 minutes
app.get("/api/reports", cacheService.middleware(300000), async (req, res) => {
const reports = await generateComplexReport(); // Expensive operation
res.json(reports); // Automatically cached
});
Migration Guide
Upgrading from Previous Versions
v1.0.17+ Changes:
- ✅ Added 6 new services (File Upload, Rate Limiting, Cache, Email, Logging, Database CRUD)
- ✅ Improved Email OTP Service with validation and rate limiting
- ✅ Added payment verification endpoints
- ✅ Enhanced error handling across all services
- ✅ Added programmatic API methods to services
- ✅ Database CRUD Service with automatic RESTful endpoint generation
Breaking Changes:
- None! All changes are backward compatible.
Troubleshooting
Common Issues
Email not sending:
- Check email credentials and app password (for Gmail)
- Verify SMTP service configuration
- Check firewall/network restrictions
File uploads failing:
- Ensure upload directory has write permissions
- Check file size limits
- Verify allowed MIME types/extensions
Rate limiting too strict:
- Adjust windowMs and maxRequests in the RateLimitService config
- Implement a custom key generator if needed
Cache not working:
- Check cache size limits
- Verify TTL values
- Ensure middleware is applied before route handler
Contributing
This is an open-source project and contributions are welcome! 🎉
We appreciate all forms of contributions:
- 🐛 Bug reports
- 💡 Feature requests
- 📝 Documentation improvements
- 🔧 Code contributions
- ⭐ Starring the project
How to Contribute
- Fork the repository
- Create a feature branch (git checkout -b feature/AmazingFeature)
- Commit your changes (git commit -m 'Add some AmazingFeature')
- Push to the branch (git push origin feature/AmazingFeature)
- Open a Pull Request
Guidelines
- Follow the existing code style
- Add tests for new features
- Update documentation as needed
- Keep commits clear and descriptive
- Be respectful and constructive in discussions
Thank you for contributing! 🙏
License
ISC
Owner
Sanchit Mehta
Open Source
This is an open-source project maintained by Sanchit Mehta. We welcome contributions from the community!
If you find this project helpful, please consider:
- ⭐ Starring the repository
- 🐛 Reporting bugs
- 💡 Suggesting new features
- 🤝 Contributing code
- 📢 Sharing with others
Made with ❤️ for developers who want to ship fast