Understanding Log Files

Salesforce B2C Commerce supports various log files. Many of them are visible by selecting Administration > Site Development > Development Setup and viewing the WebDAV Access section, which shows links to log files and path names. This section contains two primary log file links, Log Files and Import/Export, which access logs stored on a shared file system.

Other B2C Commerce log information, such as security logs and batch processing logs, is described in the sections below.

Log Files

You can find core system and custom application log files in the Log Files link. Select Administration > Site Development > Development Setup. In the WebDAV Access section, click the Log files link and you will be asked to enter a user name and password. The Log files home page (Index of /) shows a list of log files in alphabetical order.

Note: With the right permission, you can also access the log files via a WebDAV client. See Understanding the WebDAV Transfer Log.

What Are the Types of Log Files?

There are two basic types of log files: those from the core B2C Commerce environment and those from custom code. You can easily differentiate them because custom log file names begin with the prefix custom.

How Often Is the Information Captured and Deleted?

Log files are captured daily, stored for 30 days, and then automatically deleted. If needed, you can delete log files more frequently using WebDAV. After five days, the log files are moved into a Log/Archive directory and compressed into gzip files. If you want to retain log files longer than 30 days, you must download the files and store them locally.

Are There Any Size or Time Limitations?

Allowing files to grow indefinitely can cause performance and storage issues, so file size can be restricted by time or size. For example, a file can be allowed to collect data for 10 minutes or to grow up to 10 megabytes (see the table below).

You can write up to 10 MB per day into the customdebug, custominfo, customfatal, customerror and customwarn log files. The same limit applies to the amount of fatal, error, warn, debug and info messages that can be stored in custom named log files. (See the Custom Named Log Files section below.)

A day runs from 00:00 to 24:00 (midnight to midnight) in the site's time zone. Once the 10 MB limit has been reached, logging is suspended until the next day (00:00). If the limit is almost reached, for example, the log file size is 9.9 MB and the next log message would exceed the limit, the message is still written, but only up to an additional 100 KB beyond the 10 MB limit. This lets you use the full permitted log amount. When the log file reaches its maximum size for the day, the following info message is written to the log file:
+++++++++++ Maximum log file size per day reached, logging suspended. +++++++++++
Note: System logs have no storage limit.
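The daily cap behavior described above can be illustrated with a small, self-contained sketch. This only simulates the documented behavior; it is not platform code, and all names here are invented:

```javascript
// Simplified simulation of the daily custom-log size cap.
// Per the documentation: 10 MB per day, with up to 100 KB of
// overflow allowed for the message that crosses the limit.
const DAILY_LIMIT = 10 * 1024 * 1024;
const OVERFLOW_ALLOWANCE = 100 * 1024;

function makeDailyLog() {
  let bytesWritten = 0;
  let suspended = false;
  return {
    write(message) {
      if (suspended) return false; // suspended until the next day (00:00)
      const size = Buffer.byteLength(message, "utf8");
      if (bytesWritten + size > DAILY_LIMIT) {
        // The message that crosses the limit is still written,
        // but only up to 100 KB beyond the 10 MB cap.
        bytesWritten = Math.min(
          bytesWritten + size,
          DAILY_LIMIT + OVERFLOW_ALLOWANCE
        );
        suspended = true;
        return true;
      }
      bytesWritten += size;
      return true;
    },
    get suspended() { return suspended; },
  };
}
```

A write that would push the total past 10 MB is still accepted once, after which further writes are rejected until the counter resets for the new day.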

What Are the Log Levels?

One log file exists per log level, per application server. Log levels for system and customization include: debug, info, warn, error, and fatal.

The debug log level is always disabled on production instances, and enabled for in-memory logging on non-production instances. In-memory logging allows the Show Request Log feature, but doesn't generate log files.

This table describes how the various log files work: how you activate them, how long they are retained, and their purpose.

dbinit-sql
Activation/configuration: Not configurable.
Retention: 30 days.
Purpose: SQL used during dbinit.

deprecation
Activation/configuration: Not configurable.
Retention: 30 days.
Purpose: Shows usage of deprecated APIs: date and time, API name, and number of times used. If the file is empty, no deprecated APIs were used.
See also: API documentation.

analytics
Activation/configuration: Not configurable.
Retention: 30 days.
Purpose: Shows any activity with B2C Commerce analytics. Review these for messages about active merchandising reports or any errors.
See also: Analytics.

api
Activation/configuration: Not configurable.
Retention: 30 days.
Purpose: Shows API problems: area (Pipeline Dictionary, Pipeline), date and time, problem type (Error, Info, Template, Warn), path, detail type (Key, Pipelet, Pipeline, Site), and problem details. Review these for script usage and violation messages.
See also: API documentation.

customdebug
Activation/configuration: Administration > Operations > Custom Log Settings; Logger.debug in script.
Retention: 10 megabytes after clicking the Log To File button (repeatable); 5 days.
Purpose: Review this for debug messages in custom jobs, imports, payment, or code that could impact users.
See also: Configuring Custom Logging Categories.

customerror
Activation/configuration: Administration > Operations > Custom Log Settings; Logger.error in script.
Retention: 10 megabytes after clicking the Log To File button (repeatable); 5 days.
Purpose: Review this for errors in custom jobs, imports, payment, or code that could impact users.

customfatal
Activation/configuration: Administration > Operations > Custom Log Settings. Receive Email: enter a comma-separated list of valid email addresses to enable email notification of fatal messages. Email notifications are sent once every minute.
Retention: 10 megabytes after clicking the Log To File button (repeatable); 5 days.
Purpose: Review this for fatal errors in custom jobs, imports, payment, or code.

custominfo
Activation/configuration: Administration > Operations > Custom Log Settings; Logger.info in script.
Retention: 10 megabytes after clicking the Log To File button (repeatable); 5 days.
Purpose: Review this for informational logs in custom jobs, imports, payment, or code.

customwarn
Activation/configuration: Administration > Operations > Custom Log Settings; Logger.warn in script.
Retention: Always logged to file (repeatable); 5 days.
Purpose: Review this for warn messages in custom jobs, imports, payment, or code that could impact users.

debug
Activation/configuration: Not configurable.
Retention: 30 days.
Purpose: Shows debug information for the entire site if the Debug flag is enabled. Use this with SOAP testing.
See also: Configuring Custom Logging Categories.

error
Activation/configuration: Not configurable.
Retention: 30 days.
Purpose: Shows errors in B2C Commerce scripts, templates, core code, and other areas.

fatal
Activation/configuration: Not configurable.
Retention: 30 days.
Purpose: Shows fatal errors in B2C Commerce scripts, templates, core code, and other areas.

info
Activation/configuration: Not configurable.
Retention: 30 days.
Purpose: Shows information logs reported on B2C Commerce scripts, templates, core code, and other areas.

jobs
Activation/configuration: Not configurable.
Retention: 30 days.
Purpose: Shows job status information: date and time, area, and status (for example, Job Manager Stopped) on all Salesforce B2C Commerce instances and custom jobs.

migration
Activation/configuration: Not configurable.
Retention: 30 days.
Purpose: Internal migration data.

performance
Activation/configuration: Not configurable.
Retention: 30 days.
Purpose: Internal performance data. Not available on PIG instances.

quota
Activation/configuration: Not configurable.
Retention: 30 days.
Purpose: Contains B2C Commerce quota details such as object quotas, object relation quotas, and API quotas. You can also select Administration > Operations > Quota Status to view status (in addition to logs).
See also: Governance and Quotas; Quota Log File Format.

sql
Activation/configuration: Not configurable.
Retention: 30 days.
Purpose: Review these if you have replication issues.

staging
Activation/configuration: Not configurable. Replication info is written during replication.
Retention: 30 days.
Purpose: Information on B2C Commerce data and code replication processes (only on production, staging, and development instances).

sysevent
Activation/configuration: Not configurable.
Retention: 30 days.
Purpose: Shows app server registration and cartridge-related logs. Shows date and time, and event (for example, Active code version set to mobile).

syslog
Activation/configuration: Not configurable.
Retention: 30 days.
Purpose: Shows information related to API processing, data staging, and error handling. Also shows import/export-related task info, host information, and code version activation. These logs might provide useful information when troubleshooting import/export-related issues.
See also: Import/Export Analytics and Reporting.

console
Activation/configuration: Not configurable.
Retention: 30 days.
Purpose: Internal console log entries. Not available on PIG instances. Before 18.1, the console log file was named tomcat.log. Now the console log file name follows this pattern: console-hostname-appserver_id-server_name-timestamp.log, where timestamp is in yyyyMMdd format. Each entry contains: date and time, path, command (init, load, start, log), and message type (Error, Info, Severe, Warning).

warn
Activation/configuration: Administration > Operations > Custom Log Settings.
Retention: 30 days.
Purpose: Shows lock status reports (good for job info), slot warnings, and warnings for servlets. Shows date and time, WARN, area (RequestHandlerServlet, Rendering), server|application name, and message details.

Redundancy Tracking

Error and Warning messages are tracked for redundancy. If a message appears more than 10 times in 3 minutes, it will be suppressed if it reappears within the next 3 minutes. When suppression begins, the following text will precede the log message text within the log entry:

The following message was generated more than 10 times within the last 180 seconds.
It will be suppressed for 180 seconds:

The tracking of repeated messages continues while the suppression is in place. After the suppression period has ended, if a new log message appears whose occurrence rate is over the 10 message per 180 second threshold, it will be logged with the message described above, and a new suppression period will begin.
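As a rough sketch, the suppression rule described above could be modeled like this. The platform implements this internally; the code below is illustrative only and all names are invented:

```javascript
// Illustrative model of redundancy tracking: a message that occurs
// more than 10 times within 180 seconds is suppressed for the next
// 180 seconds; tracking continues while suppression is in place.
const THRESHOLD = 10;        // occurrences
const WINDOW_MS = 180 * 1000; // 3 minutes

function makeSuppressionTracker(now = () => Date.now()) {
  const seen = new Map(); // message -> { timestamps, suppressedUntil }
  return function shouldLog(message) {
    const t = now();
    let entry = seen.get(message);
    if (!entry) {
      entry = { timestamps: [], suppressedUntil: 0 };
      seen.set(message, entry);
    }
    if (t < entry.suppressedUntil) {
      entry.timestamps.push(t); // keep tracking while suppressed
      return false;             // entry is not written to the log
    }
    // Keep only occurrences inside the sliding 180-second window.
    entry.timestamps = entry.timestamps.filter((ts) => t - ts < WINDOW_MS);
    entry.timestamps.push(t);
    if (entry.timestamps.length > THRESHOLD) {
      entry.suppressedUntil = t + WINDOW_MS; // begin a new suppression period
      return true; // this occurrence is logged, prefixed with the notice
    }
    return true;
  };
}
```

The 11th occurrence within the window is still logged (with the suppression notice prepended), and subsequent occurrences within the suppression period are dropped.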

Import/Export Link

You can find import/export-specific information in the import/export log files. In Business Manager, navigate to Administration > Site Development > Development Setup. In the WebDAV Access section, click the Import/Export link to view log details on any activity related to import and export. This information can provide you with critical insight into job failures or data errors. You can use this link, or access the data via a WebDAV client, when you perform import/export functions via Business Manager or programmatically via a pipelet. Import and export can be performed on many types of data.

Note: With the right permission, you can also access the log files via a WebDAV client. See Understanding the WebDAV Transfer Log.

You should analyze the data in these logs as part of daily operations and each time there is an issue with a related process, such as a catalog import, price book import, or custom feed.

The Import/Export directory has the following structure on all instances.

log/ - Log Directory

This directory contains logs for specific processes such as catalog/customer/product batch processing, catalog import and export, inventory import, and price book import. It also contains validation logs for Price Book, Promotions, Metadata, Coupon, Inventory, and Catalog.

processlogs_archive/ - Archive of logs

src/ - B2C Commerce import/export source files

This directory contains B2C Commerce import/export source files, which include core processes such as catalog import and custom jobs for feeds such as affiliates, Certona, and ChannelAdvisor.

This is an example of a subdirectory structure for src/:

activedata/
affiliatefeeds/
archive/
catalog/
catalog_archive/
channeladvisor/
customer/
customization/
customobject/
export/
instance/
library/
operations/
order/
upload/ (Directory for upload of files before processing)

See Import/Export Analytics and Reporting and Import/Export Error Handling.

Import/export files are deleted in the IMPEX/log folder 30 days after creation on Staging and Production instances and 7 days after creation on Sandbox and Development instances.

Custom Named Log Files

The method public static Log getLogger( String fileNamePrefix, String category ) enables you to obtain a log instance and write into a custom log file through that log instance. When you call the method with the given file name prefix the first time, B2C Commerce will create a log instance and a corresponding log file in the usual log directory. When you call the method with the same file name prefix a second time, B2C Commerce will return the same log instance as the first call, and all log messages are stored in the same log file.
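For example, a script might obtain such a logger as follows. This is a hedged sketch using the dw/system/Logger API; the prefix and category names are illustrative, and the code runs only on the platform, not standalone:

```javascript
// B2C Commerce script sketch (platform-only): obtain a named log
// instance and write to its custom log file. The prefix 'myfeed' and
// category 'export' are illustrative examples.
var Logger = require('dw/system/Logger');

// The first call with this prefix creates the log instance and file;
// later calls with the same prefix return the same instance.
var feedLog = Logger.getLogger('myfeed', 'export');
feedLog.info('Feed run started');
```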

The log file name follows the same pattern used for system and custom log files:

custom-<file name prefix>-<hostname>-appserver-<creation date of the file in GMT>.log

This is an example:

custom-prefix-blade0-1-appserver-20110706.log
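The example file name above can be reproduced with a small helper. This is an illustrative sketch only; the platform builds these names itself:

```javascript
// Builds a custom log file name following the documented pattern:
// custom-<prefix>-<hostname>-appserver-<yyyyMMdd in GMT>.log
function customLogFileName(prefix, hostname, date) {
  const yyyy = date.getUTCFullYear();
  const mm = String(date.getUTCMonth() + 1).padStart(2, "0");
  const dd = String(date.getUTCDate()).padStart(2, "0");
  return `custom-${prefix}-${hostname}-appserver-${yyyy}${mm}${dd}.log`;
}
```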

The file name prefix must follow these rules:

The log categories provided through Business Manager for the custom script log are also applied to this type of logger.

Note: Governance allows only 40 different log file names (therefore, 40 log instances and log files) per day. This quota is a hard limit and is counted per appserver.

See Configuring Custom Logging Categories.

Quota

The quota log contains B2C Commerce quota details such as object quotas, object relation quotas, and API quotas.

See Governance and Quotas.

Security Logs

Security log files are located as follows: https://<instance name>.demandware.net/on/demandware.servlet/webdav/Sites/Securitylogs.

Security log entries can look like this:

[2015-10-28 02:23:19.139 GMT] [DW-SEC] (User: 'username' (Sites), IP: 100.100.10.100 [LOGIN] : logged in.)
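An entry in this format can be split into fields with a small parser. This is a hedged sketch: the regular expression is inferred from the single example above and may not cover every entry variant:

```javascript
// Parses a security log entry of the form shown above into its fields:
// [<timestamp>] [DW-SEC] (User: '<user>' (<realm>), IP: <ip> [<event>] : <message>)
// The pattern is inferred from one documented example.
const SEC_LOG_RE =
  /^\[(.+?)\] \[DW-SEC\] \(User: '(.+?)' \((.+?)\), IP: ([\d.]+) \[(\w+)\] : (.+?)\)$/;

function parseSecurityLogEntry(line) {
  const m = SEC_LOG_RE.exec(line);
  if (!m) return null; // entry doesn't match the expected shape
  const [, timestamp, user, realm, ip, event, message] = m;
  return { timestamp, user, realm, ip, event, message };
}
```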

The security log also includes:

Additional Tools

Additional tools include the request log console and batch processing logs.

Request Log Console

You can use the Request Log Console in the Storefront Toolkit when you troubleshoot server calls. The request log shows the last server call made and any server calls made while the tool is open. You can also view AJAX calls or other script calls. If you are using this tool to debug a script, you can use custom logging in your script if you configure the custom log settings in Business Manager. The Request Log tool shows exception information from the server and the custom logging in real time. Enable the Storefront Toolkit in Business Manager as a site preference, and then open the Request Log from site > Storefront Toolkit > Request Log.

See Using the Request Log Console.

Batch Processing Logs

You can perform batch processing on catalogs, customers and products. Product/catalog batch processing and customer batch processing are not combined into the same interface. View the results of either from their respective Business Manager sections. For example, you can view customer batch processes by selecting site > Merchant Tools > Customers > Batch Processing, but not by selecting site > Merchant Tools > Products and Catalogs > Batch Processing.

Batch processing logs can contain error reports as a result of a failed batch process. A log file can contain WARN, ERROR, or FATAL log messages. You can view the logs from the specific Batch Processing page, or from the Import/Export log directory, for example:

\sharedata\sites\Sites-Site\units\Sites\impex\log\Batch-Customer-20100128192425062.log.

See the Import/Export section above. See Batch Processing.

Related Links

Understanding the WebDAV Transfer Log

Adding Logging to Your Scripts

Analytics

Configuring Custom Logging Categories

Governance and Quotas

Quota Log File Format

Import/Export Analytics and Reporting

Import/Export Error Handling

Using the Request Log Console

Batch Processing

File Manager Job for Deletion of Logs

Troubleshooting