Are you struggling to find a step-by-step guide to Salesforce debug logs? If yes, this blog will answer all your queries. Salesforce is a widely used CRM tool that provides one integrated platform for sales, marketing, service, and commerce. Salesforce debug logs help you keep track of the timing and status of transactions. In this blog, you will learn what Salesforce debug logs are, how to create them, and how to use them to track down issues.

Let’s see how this blog is structured for you:

  1. What are Salesforce Debug Logs?
  2. Create Salesforce Debug Logs
  3. View Salesforce Debug Logs
  4. Salesforce Debug Log Categories
  5. Debug Levels
  6. Limitations of Salesforce Debug Logs
  7. Conclusion

What are Salesforce Debug Logs?

Salesforce debug logs keep track of the events (transactions) that happen in your Salesforce organization. They contain information about every transaction that runs in Salesforce, including its timing and status.

Salesforce generates debug logs when a user has an active trace flag. A trace flag filters the logging for a transaction: it specifies the user or entity to trace, a debug level, and a start and expiration time, and each resulting log line records its level (such as ERROR, WARN, or DEBUG) and the status of the job or transaction. Once you set a trace flag, the system generates a debug log whenever the traced user performs a transaction. These logs can be useful for developers and integration partners.

Salesforce Debug Logs are invaluable for troubleshooting issues, especially when preparing for a migration to systems like MySQL.

Salesforce debug logs can hold the following information (a short Apex sketch after this list shows a transaction that produces several of these event types):

  1. Database changes
  2. HTTP requests
  3. Resources used and errors in Apex
  4. Automated workflow rules
  5. Start and end times
  6. Status of the transactions
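
To make this concrete, here is a minimal anonymous Apex sketch (record names are illustrative) of a single transaction that produces several of these event types. Run with an active trace flag, it writes a debug log containing the database changes, the query, and the debug messages:

```apex
// Minimal sketch: one transaction that emits several kinds of log events.
Account acct = new Account(Name = 'Debug Log Demo');          // illustrative record
insert acct;                                                  // database change
List<Account> rows = [SELECT Id, Name FROM Account LIMIT 5];  // SOQL query
System.debug('Fetched ' + rows.size() + ' accounts');         // debug message
delete acct;                                                  // database change
System.debug(LoggingLevel.INFO, 'Transaction finished');      // status marker
```
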
Hevo Data: Migrate your Data Conveniently

Hevo is a No-code Data Pipeline. It supports pre-built data integrations from 100+ data sources (including free sources like Salesforce). Hevo is a consistent and reliable solution for your ETL process. You can enrich your data and transform it into an analysis-ready form without writing any code. You can also leverage the extensive logging capabilities of Hevo to understand how your pipeline behaves.

Get Started with Hevo for Free

Let’s discuss some unbeatable features of Hevo:

  1. Fully Automated: Hevo can be set up in a few minutes and requires zero maintenance and management.
  2. Scalability: Hevo is built to handle millions of records per minute without any latency.
  3. Secure: Hevo offers two-factor authentication and end-to-end encryption so that your data is safe and secure.
  4. Fault-Tolerant: Hevo is capable of detecting anomalies in the incoming data and informs you instantly. All the affected rows are kept aside for correction so that it doesn’t hamper your workflow.
  5. Real-Time: Hevo provides real-time data migration. So, your data is always ready for analysis.
  6. Live Support: The Hevo team is available round the clock to extend exceptional support to its customers through chat, email, and support calls.
Sign up here for a 14-Day Free Trial!

Create Salesforce Debug Logs

Salesforce has an excellent user interface that allows you to perform various operations, and you can create Salesforce debug logs directly from it. Follow the step-by-step process below to create Salesforce debug logs (a scripted alternative is sketched after the steps):

  1. In the Salesforce Setup window, enter ‘Debug Logs’ in the search box and select Debug Logs.
  2. To set up a debug log, click ‘New’.
  3. Select the user, the start date, and the expiration date (a future date) to set up the debug log.
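
If you prefer to script this step, trace flags can also be created through the Tooling REST API. The anonymous Apex below is a hedged sketch, not an official recipe: the API version (v58.0) and the DebugLevel Id are placeholders to replace with values from your org, and the callout assumes your org’s own domain is registered as a Remote Site Setting:

```apex
// Sketch: create a TraceFlag for the running user via the Tooling REST API.
HttpRequest req = new HttpRequest();
req.setEndpoint(URL.getOrgDomainUrl().toExternalForm()
    + '/services/data/v58.0/tooling/sobjects/TraceFlag/');
req.setMethod('POST');
req.setHeader('Authorization', 'Bearer ' + UserInfo.getSessionId());
req.setHeader('Content-Type', 'application/json');
req.setBody(JSON.serialize(new Map<String, Object>{
    'TracedEntityId' => UserInfo.getUserId(),     // user whose activity is traced
    'DebugLevelId'   => '7dl000000000001',        // placeholder: a real DebugLevel Id
    'LogType'        => 'USER_DEBUG',
    'StartDate'      => System.now(),
    'ExpirationDate' => System.now().addHours(1)  // must be a future date
}));
HttpResponse res = new Http().send(req);
System.debug(res.getStatusCode() + ' ' + res.getBody());
```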

View Salesforce Debug Logs

Let’s see how to view your Salesforce debug logs (a programmatic way to list them follows the steps):

  1. To view the debug logs, from the Setup option in Salesforce, enter ‘Debug Logs’ in the ‘Quick Find’ box, then select ‘Debug Logs’.
  2. Once you are on the Debug Logs page, click the ‘View’ button next to a log to examine it.
  3. Click ‘Download’ to save the log as a text (.log) file.
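
Logs can also be listed and fetched programmatically. The following anonymous Apex is a hedged sketch using the Tooling REST API, under the same assumptions as the trace-flag example above (placeholder API version, Remote Site Setting for your org’s domain):

```apex
// Sketch: list the ten most recent debug logs via the Tooling REST API.
HttpRequest req = new HttpRequest();
req.setEndpoint(URL.getOrgDomainUrl().toExternalForm()
    + '/services/data/v58.0/tooling/query/?q='
    + EncodingUtil.urlEncode(
        'SELECT Id, Operation, Status, LogLength, StartTime '
        + 'FROM ApexLog ORDER BY StartTime DESC LIMIT 10', 'UTF-8'));
req.setMethod('GET');
req.setHeader('Authorization', 'Bearer ' + UserInfo.getSessionId());
HttpResponse res = new Http().send(req);
System.debug(res.getBody());
// The raw text of a single log can then be fetched with a GET request to:
//   /services/data/v58.0/tooling/sobjects/ApexLog/<logId>/Body/
```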

Salesforce Debug Log Categories

Salesforce debug logs contain information about the following categories (the sketch after this list shows a DebugLevel record that assigns a logging level to each category):

  1. Database: The debug log contains information about database activity, such as data manipulation (DML) statements and SOQL queries.
  2. Workflow: Information about workflow rules, flows, processes, etc. is also stored in the debug log, so you can analyze your automation by reading it.
  3. Validation: It may contain information about validation rules, such as the name of the rule and whether it evaluated to true or false.
  4. Callout: Callout events contain the request and response XML exchanged with external web services. They help in debugging API-related issues or troubleshooting user access to external objects.
  5. Apex Code: It includes information about Apex code, i.e., DML statements, triggers, the start and completion of methods, etc.
  6. Apex Profiling: It includes profiling information, such as the number of queries executed, the number of emails sent, etc.
  7. Visualforce: It includes information about formula evaluation, serialization and deserialization of the view state, and more.
  8. System: It holds all calls to system methods, such as the System.debug method.
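
These category names map directly onto the fields of a DebugLevel record, which assigns a logging level to each category. As a hedged sketch (with the same placeholder API version and Remote Site Setting assumptions as the earlier examples), a DebugLevel can be created like this:

```apex
// Sketch: create a DebugLevel via the Tooling REST API; one level per category.
HttpRequest req = new HttpRequest();
req.setEndpoint(URL.getOrgDomainUrl().toExternalForm()
    + '/services/data/v58.0/tooling/sobjects/DebugLevel/');
req.setMethod('POST');
req.setHeader('Authorization', 'Bearer ' + UserInfo.getSessionId());
req.setHeader('Content-Type', 'application/json');
req.setBody(JSON.serialize(new Map<String, Object>{
    'DeveloperName' => 'Troubleshooting',  // illustrative name
    'MasterLabel'   => 'Troubleshooting',
    'ApexCode'      => 'FINEST',
    'ApexProfiling' => 'INFO',
    'Callout'       => 'INFO',
    'Database'      => 'FINE',
    'System'        => 'DEBUG',
    'Validation'    => 'INFO',
    'Visualforce'   => 'INFO',
    'Workflow'      => 'INFO'
}));
System.debug(new Http().send(req).getBody()); // response includes the new Id
```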

Debug Levels

Log levels determine the amount and type of information logged when a transaction runs. Different log levels help you understand the process, the status of the job, internal details of code execution, etc. Below are the log levels, listed from lowest to highest.

  1. Error/Warn/Info: These levels include error, warning, and info messages from the code. They help you determine the status of the job, and if a job results in an error, the log will display the error message.
  2. Debug: It includes debug statements, which are usually generated by the System.debug method.
  3. Fine, Finer: These levels contain all DML statements and SOQL/SOSL queries, along with information about the invocation of user-defined methods and the resources used.
  4. Finest: It includes everything generated at the Fine and Finer levels, along with additional details about Apex script execution.

Log levels are cumulative, i.e., if you select a higher level, all the information from the lower levels is also included. For example, if you select the DEBUG level, the log will include all events logged at the INFO, WARN, and ERROR levels.
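
In Apex code, System.debug accepts an explicit LoggingLevel, and whether each statement is written depends on the Apex Code level in the active trace flag’s debug level. A minimal sketch:

```apex
// Sketch: each line is written only if the Apex Code category is set to the
// stated level or a finer one.
System.debug(LoggingLevel.ERROR,  'Job failed');         // kept even at ERROR
System.debug(LoggingLevel.INFO,   'Step 2 of 5 done');   // needs INFO or finer
System.debug(LoggingLevel.DEBUG,  'Payload received');   // needs DEBUG or finer
System.debug(LoggingLevel.FINEST, 'Variable dump');      // written only at FINEST
```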

Limitations of Salesforce Debug Logs

Limitations of Salesforce debug logs are listed below:

  1. A Salesforce debug log can be a maximum of 20 MB in size. If a log grows beyond 20 MB, Salesforce automatically truncates it by removing older log lines; these lines can be removed from any location in the file, not just the beginning.
  2. System debug logs are retained for 24 hours. Monitoring debug logs are retained for seven days.
  3. If you generate more than 1,000 MB of debug logs within a 15-minute window, trace flags are disabled automatically. You will receive an email notification so that you can analyze the problem and re-enable them.
  4. When your org accumulates more than 1,000 MB of debug logs, Salesforce prevents users in the organization from adding or editing trace flags. To add or edit trace flags and generate more logs after you reach the limit, you need to delete some debug logs (a cleanup sketch follows this list).
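
For that cleanup, individual logs can be deleted through the Tooling REST API. The sketch below is hedged like the earlier ones: replace <logId> with a real ApexLog Id (for example, one returned by the query sketch in the View section), and the same placeholder API version and Remote Site Setting assumptions apply:

```apex
// Sketch: delete one old debug log via the Tooling REST API.
HttpRequest req = new HttpRequest();
req.setEndpoint(URL.getOrgDomainUrl().toExternalForm()
    + '/services/data/v58.0/tooling/sobjects/ApexLog/<logId>'); // placeholder Id
req.setMethod('DELETE');
req.setHeader('Authorization', 'Bearer ' + UserInfo.getSessionId());
HttpResponse res = new Http().send(req);
System.debug(res.getStatusCode()); // 204 indicates the log was deleted
```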

Conclusion

In this blog post, you have learned about Salesforce debug logs, how to create them, and how they help you debug issues. However, if you are looking for an easier way to move your Salesforce data, you should try Hevo.

Hevo is a No-code Data Pipeline that has 100+ inbuilt connectors like Salesforce (Free Source Connector with Hevo) that can connect to any source in a minute. You can also leverage the extensive logging capabilities of Hevo to understand how your pipeline behaves.

Visit our Website to Explore Hevo

Want to take Hevo for a Spin? Sign Up and simplify your Data Integration Process.

Share your experience of working with Salesforce Debug logs in the comment section below.

Vishal Agrawal
Technical Content Writer, Hevo Data

Vishal Agarwal is a Data Engineer with 10+ years of experience in the data field. He has designed scalable and efficient data solutions, and his expertise lies in AWS, Azure, Spark, GCP, SQL, Python, and other related technologies. By combining his passion for writing with the knowledge he has acquired over the years, he wishes to help data practitioners solve the day-to-day challenges they face in data engineering. In his articles, Vishal applies his analytical thinking and problem-solving approach to untangle the intricacies of data integration and analysis.