ps-chronicle
A PS-specific logger package that uses winston internally. It creates logs in two formats, JSON or Simple; the default format is JSON. The allowed log levels are: error, wspayload, info, warn, debug. The timezone is GMT.
Installation

```shell
npm i ps-chronicle
```
Usage
Step 1) Import/require the package:

```javascript
const Logger = require('ps-chronicle');
```

Step 2) Initialize your logger, once per file/class, inside your lambda handler. All three fields are optional: if outputFormat is not given, the default format is "json"; if logLevel is not set during initialization, the default is "debug".

```javascript
var outputFormat = 'json'; // or 'simple'
var logger = new Logger('example.js', 'json', 'info'); // (fileName, outputFormat, logLevel)
```

Step 3) Set the request ID once per lambda. The request ID can be obtained from the lambda's context object; for more information please visit https://docs.aws.amazon.com/lambda/latest/dg/nodejs-context.html. Set it in the form LoggerClassName.setRequestId(context.awsRequestId):

```javascript
Logger.setRequestId(context.awsRequestId); // called as ClassName.methodName, as it is a static method
```

Step 4) Set the customer name once per lambda.

```javascript
Logger.setCustomerName('RSA'); // called as ClassName.methodName, as it is a static method
```

Step 5) Set the method name in each method.

```javascript
logger.setMethodName('testMethod()');
```

Step 6) If the log level passed to the log() function is not in scope, the default log level (info) is used. log() function parameters: i. log level, ii. message, iii. after the first two parameters, any number of additional parameters may be given; all of them become part of the key additionalinfo, which is an array.
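As a sketch of the shape described in step 6 (this mimics the documented behaviour with plain rest parameters; it is not ps-chronicle's actual implementation, and the function name is illustrative):

```javascript
// Sketch only: how log()'s trailing arguments collect into a single
// additionalinfo array, and how an out-of-scope level falls back to info.
const LEVELS = ['error', 'wspayload', 'info', 'warn', 'debug'];

function buildLogEntry(level, message, ...additionalInfo) {
	const normalized = String(level).toLowerCase();
	return {
		level: LEVELS.includes(normalized) ? normalized : 'info', // default when out of scope
		message: message,
		additionalinfo: additionalInfo, // every argument after the first two lands here
	};
}
```

For example, `buildLogEntry('silly', 'msg', { tags: 'HEADERS' })` would produce level `'info'` (since `'silly'` is out of scope) with a one-element additionalinfo array.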
```javascript
function sum(a, b) {
	logger.setMethodName('sum()');

	logger.log('INFO', 'testing info logger');
	logger.log('silly', 'testing silly logger'); // 'silly' is not in scope, so the default level (info) is used

	logger.log(
		'error',
		'testing error logger',
		{ tags: 'HEADERS', headers: event.headers },
		{ tags: 'BODY', body: event.body }
	);
	logger.log(
		'wspayload',
		'testing simple wspayload logger',
		{ tags: 'HEADERS', headers: event.headers },
		{ tags: 'BODY', body: event.body }
	);
	logger.log(
		'warn',
		'testing simple warn logger',
		{ tags: 'HEADERS', headers: event.headers },
		{ tags: 'BODY', body: event.body }
	);
	logger.log(
		'debug',
		'testing simple debug logger',
		{ tags: 'HEADERS', headers: event.headers },
		{ tags: 'BODY', body: event.body }
	);

	// rest of the code
}
```

Important: if logs are not printed from async functions, add the following at the end of your lambda function:
```javascript
await logger.waitForLogger(Logger);
```

Additional methods: filtering logs and sending them to S3
The function below filters logs by request ID and sends them to the given S3 bucket (the bucket must already exist). Parameters used:
- logGroupName : available on the lambda context object; for reference, visit https://docs.aws.amazon.com/lambda/latest/dg/nodejs-context.html
- region : the region where the log group is present; the bucket must be in the same region
- requestId : available on the lambda context object as awsRequestId; for reference, visit https://docs.aws.amazon.com/lambda/latest/dg/nodejs-context.html
- logStreamName : available on the lambda context object; for reference, visit https://docs.aws.amazon.com/lambda/latest/dg/nodejs-context.html
- startTime : epoch time in milliseconds; visit https://www.epochconverter.com/
- endTime : epoch time in milliseconds; visit https://www.epochconverter.com/
- bucket : the name of the bucket in which the filtered logs should be stored
- folderStructure : the folder structure under which the logs should be stored inside the bucket
- appName : the app name for which this method is being used
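Rather than converting by hand, startTime and endTime can also be derived in code (a sketch; the 24-hour window below is arbitrary, chosen for illustration):

```javascript
// Epoch times in milliseconds for the filter window.
const endTime = Date.now();                      // now, in epoch milliseconds
const startTime = endTime - 24 * 60 * 60 * 1000; // 24 hours earlier

// A fixed timestamp can be converted directly:
const fixedStart = new Date('2020-12-27T16:58:45Z').getTime(); // 1609088325000
```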
```javascript
let params = {
	logGroupName: context.logGroupName,
	region: process.env.AWS_REGION,
	requestId: context.awsRequestId,
	logStreamName: context.logStreamName,
	startTime: 1609088325000,
	endTime: 1609174725000,
	bucket: 'log-export-bucket-demo',
	folderStructure: 'config/jobs/1075/export',
	appName: 'PS_AUTOMATION',
};

const response = await logger.filterLogToS3(params);
```

Response:
If successful:

```json
{
	"responseCode": 200,
	"responseBody": {
		"Bucket": "log-export-bucket-demo",
		"Key": "config/jobs/1075/export/PS_AUTOMATION_7666e842-ef43-40a5-81b6-56ceb4cc4f9a_29-12-2020 1:1:16",
		"ETag": "2db336acb2a5c30a6f14e49a7af6e89e"
	}
}
```

If some error occurred - responseCode : 500, responseBody : the error message or "Internal Server Error"

If the bucket is not found - responseCode : 404, responseBody : 'Bucket : {$bucketName} does not exist'
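The response codes above can be handled with a simple branch. A sketch (the sample `response` object is hand-written to match the documented shape, not the output of a real call, and the helper name is illustrative):

```javascript
// Sketch: branching on the documented filterLogToS3 response codes.
function describeExportResult(response) {
	if (response.responseCode === 200) {
		// responseBody holds the S3 location of the exported logs
		const { Bucket, Key } = response.responseBody;
		return 'Logs exported to s3://' + Bucket + '/' + Key;
	}
	// 404 (bucket missing) and 500 (internal error) both carry a message string
	return 'Export failed (' + response.responseCode + '): ' + response.responseBody;
}

// Hand-written sample matching the documented success shape:
const sample = {
	responseCode: 200,
	responseBody: { Bucket: 'log-export-bucket-demo', Key: 'config/jobs/1075/export/PS_AUTOMATION_example' },
};
```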