- Introduction
- Getting Started
- Basic Logging
- Configuration Options
- Method-Level Logging with Attributes
- Data Change Tracking
- Performance Monitoring
- Progress Logging for Long Operations
- Throttling and Rate Limiting
- Handling Sensitive Data
- Correlation and Context
- Log Routing
- Database Logging
- Asynchronous Method Logging
- Structured Logging
- Advanced Scenarios
- Best Practices
- Troubleshooting
AnnotationLogger is a comprehensive .NET logging framework designed to simplify application logging through attribute-based configuration. It provides powerful features including method interception, data change tracking, performance monitoring, and advanced routing capabilities.
Key features:
- Declarative logging through attributes
- Minimal code changes required for extensive logging
- Automatic method parameter and return value logging
- Data change tracking across method calls
- Performance monitoring and statistics
- Progress logging for long-running operations
- Support for multiple log destinations
- Protection of sensitive data
Add AnnotationLogger to your project using your preferred method:
dotnet add package AnnotationLogger

Set up AnnotationLogger in your application startup code:
using AnnotationLogger;
// Configure logger in Program.cs or Startup.cs
LogManager.Configure(config =>
{
config.Logger = new ConsoleLogger();
config.Environment = EnvironmentType.Development;
});
// Optional: Configure data change tracking
LogManager.ConfigureDataLogging(config =>
{
config.EnableDataChangeTracking = true;
config.MaxComparisonDepth = 3;
});

AnnotationLogger provides direct logging methods for simple cases:
// Simple message logging
LogManager.Info("Application started");
// With context data
LogManager.Info("User logged in", new Dictionary<string, object> {
{ "UserId", 123 },
{ "Username", "johndoe" }
});
// Different log levels
LogManager.Trace("Detailed diagnostic information");
LogManager.Debug("Debugging information");
LogManager.Info("General information");
LogManager.Warning("Warning condition");
LogManager.Error("Error condition");
LogManager.Critical("Critical condition");
// Exception logging
try
{
// Some operation that might throw
}
catch (Exception ex)
{
LogManager.Exception(ex, "Error processing request");
}

Configure general logging behavior:
LogManager.Configure(config =>
{
// Set the logger (can use multiple loggers with CompositeLogger)
// Explicit ILogger[] — the loggers only share the ILogger interface,
// so an implicitly typed array (new[]) would not compile
config.Logger = new CompositeLogger(new ILogger[] {
new ConsoleLogger(LogLevel.Info),
new FileLogger(LogLevel.Debug, "application.log")
});
// Environment settings affect which logs are emitted
config.Environment = EnvironmentType.Development; // Development, Testing, Staging, Production
// Control what gets logged
config.EnableMethodEntryExit = true;
config.EnableParameterLogging = true;
config.EnableReturnValueLogging = true;
config.EnableExecutionTimeLogging = true;
// Output format
config.UseStructuredOutput = false; // Set to true for JSON output
// Advanced settings
config.MaxStringLength = 10000; // Truncate long strings
config.MaxCollectionItems = 100; // Limit collection items in logs
config.MaxObjectDepth = 3; // Limit depth for complex objects
});

Configure data change tracking behavior:
LogManager.ConfigureDataLogging(config =>
{
config.EnableDataChangeTracking = true;
config.MaxComparisonDepth = 3;
config.IncludeSensitivePropertiesInChanges = false;
config.LogBeforeState = false; // Set to true to include full object state before changes
config.LogAfterState = false; // Set to true to include full object state after changes
config.DataChangeLogLevel = LogLevel.Info;
});

Annotate methods with logging attributes:
public class UserService
{
[LogInfo] // Log at Info level
public User GetUser(int userId)
{
// Method implementation
return new User { Id = userId, Name = "John" };
}
[LogDebug] // Only logs in Development/Testing
public List<User> SearchUsers(string searchTerm)
{
// Method implementation
return new List<User>();
}
[LogError] // For methods where you want to capture failures
public bool UpdateUserEmail(int userId, string newEmail)
{
// Method implementation
return true;
}
[LogWarning]
public User DeleteUser(int userId)
{
// Method implementation
return null;
}
[LogCritical]
public void ImportantSystemOperation()
{
// Method implementation
}
[LogTrace] // Most detailed logging level
public void HighlyDetailedOperation()
{
// Method implementation
}
[LogProd] // Always logs even in Production
public void CriticalBusinessOperation()
{
// Method implementation
}
}

Fine-tune what gets logged:
[LogInfo(
IncludeParameters = true, // Log method parameters
IncludeReturnValue = true, // Log return value
IncludeExecutionTime = true // Log execution time
)]
public User GetUserDetails(int userId, bool includeAddresses)
{
// Method implementation
return new User();
}To activate logging, call methods using the LoggedMethodCaller:
var userService = new UserService();
// For methods that return a value
User user = LoggedMethodCaller.Call(() => userService.GetUser(123));
// For void methods
LoggedMethodCaller.Call(() => userService.ImportantSystemOperation());

Track changes to objects across method calls:
public class UserService
{
[TrackDataChanges]
public User UpdateUser([BeforeChange] User user, string newName)
{
user.Name = newName;
return user;
}
}
// Usage
var user = new User { Id = 1, Name = "John" };
LoggedMethodCaller.Call(() => userService.UpdateUser(user, "John Doe"));

Configure detailed data change tracking:
[TrackDataChanges(
DetailedComparison = true, // Compare all properties
MaxComparisonDepth = 3, // How deep to compare nested objects
IncludeOriginalState = true, // Include the original object state in logs
IncludeUpdatedState = true, // Include the new object state in logs
OperationType = "UserUpdate" // Custom operation name in logs
)]
public User UpdateUserProfile(
[BeforeChange] User existingUser,
UserProfileUpdate updates)
{
// Apply updates
if (updates.Name != null)
existingUser.Name = updates.Name;
if (updates.Email != null)
existingUser.Email = updates.Email;
return existingUser;
}

Exclude certain properties from change tracking:
public class User
{
public int Id { get; set; }
public string Name { get; set; }
public string Email { get; set; }
[ExcludeFromComparison]
public DateTime LastLoginTime { get; set; }
[ExcludeFromComparison("Frequently changes and not relevant for audit")]
public int LoginCount { get; set; }
}

Manually track changes when attribute-based tracking isn't suitable:
// Create a tracker for an entity
var tracker = DataChangeTracker.Track(user, user.Id.ToString(), "UserProfileUpdate");
// Add custom context if needed
tracker.WithContext("RequestId", requestId)
.WithContext("UserId", currentUserId);
// Make changes to the entity
user.Name = "New Name";
user.Email = "new.email@example.com";
// Log the changes
var changes = tracker.LogChanges(user);

Performance tracking happens automatically for logged methods:
[LogInfo]
public void PerformanceIntensiveOperation()
{
// Method implementation
}
// Call the method several times
for (int i = 0; i < 100; i++)
{
LoggedMethodCaller.Call(() => service.PerformanceIntensiveOperation());
}
// Get performance statistics
var stats = LogManager.GetPerformanceTracker().GetStats();

Access and display performance data:
var stats = LogManager.GetPerformanceTracker().GetStats();
foreach (var stat in stats)
{
Console.WriteLine($"Method: {stat.Value.MethodName}");
Console.WriteLine($" Calls: {stat.Value.CallCount}");
Console.WriteLine($" Avg Time: {stat.Value.AverageTime:F2}ms");
Console.WriteLine($" Min Time: {stat.Value.MinTime}ms");
Console.WriteLine($" Max Time: {stat.Value.MaxTime}ms");
Console.WriteLine($" Median: {stat.Value.MedianTime:F2}ms");
}

Get stats for a specific method:
var methodStats = LogManager.GetPerformanceTracker().GetStatsForMethod("UserService.GetUserById");
if (methodStats != null)
{
Console.WriteLine($"Method: {methodStats.MethodName}");
Console.WriteLine($"Average execution time: {methodStats.AverageTime:F2}ms");
Console.WriteLine($"Called {methodStats.CallCount} times");
}

Reset performance tracking data:
// Clear all performance data
LogManager.GetPerformanceTracker().Reset();

Enable automatic progress logging for long-running methods:
[LogInfo]
[WithProgressLogging(intervalMs: 1000)] // Log progress every second
public async Task ImportLargeFileAsync(string filePath)
{
// Long operation
for (int i = 0; i < 100; i++)
{
await Task.Delay(500);
// Process file...
}
}

For more control, create a progress logger manually:
public async Task ProcessBatchesAsync(IEnumerable<DataBatch> batches)
{
// Create a progress logger that reports every 2 seconds
using (var progressLogger = new ProgressLogger("Batch Processing", intervalMs: 2000))
{
foreach (var batch in batches)
{
await ProcessBatchAsync(batch);
}
}
// Progress logger is automatically disposed, ending the progress tracking
}

Prevent log flooding in high-frequency operations:
[LogDebug]
[ThrottleLogging(MaxLogsPerSecond = 5)]
public double CalculateValue(double input)
{
// High-frequency operation
return Math.Pow(input, 2);
}

Throttling works with other logging attributes:
[LogInfo]
[ThrottleLogging(MaxLogsPerSecond = 10)]
[WithProgressLogging(intervalMs: 5000)]
public async Task ProcessHighVolumeDataAsync()
{
// Process many items with throttled logging
for (int i = 0; i < 10000; i++)
{
// Process data...
await Task.Delay(10);
}
}

Protect sensitive data in logs:
public class User
{
public int Id { get; set; }
public string Username { get; set; }
[MaskInLogs] // Replaces value with "***"
public string Password { get; set; }
[MaskInLogs(
ShowFirstChars = true, FirstCharsCount = 3,
ShowLastChars = true, LastCharsCount = 4
)]
public string Email { get; set; } // e.g. "joh***.com" (first 3 and last 4 characters visible)
[RedactContents(ReplacementText = "[CREDIT CARD REDACTED]")]
public string CreditCardNumber { get; set; }
[ExcludeFromLogs]
public string ApiKey { get; set; } // Completely excluded from logs
}

Customize how sensitive data is masked:
[MaskInLogs(
MaskingPattern = "•••", // Custom mask character
ShowFirstChars = true,
FirstCharsCount = 1
)]
public string SSN { get; set; } // "1•••"

Document why data is sensitive:
[ExcludeFromLogs("Contains PII that should never be logged")]
public string SocialSecurityNumber { get; set; }
[MaskInLogs("Financial information - partially visible for debugging only")]
public string AccountNumber { get; set; }

Track related operations across different components:
// Start a new logical operation
CorrelationManager.StartNewCorrelation();
// Get the current correlation ID
string correlationId = CorrelationManager.CurrentCorrelationId;
// Pass the correlation ID to other systems
httpClient.DefaultRequestHeaders.Add("X-Correlation-ID", correlationId);
// Later, in another component:
string incomingCorrelationId = httpRequest.Headers["X-Correlation-ID"];
if (!string.IsNullOrEmpty(incomingCorrelationId))
{
CorrelationManager.CurrentCorrelationId = incomingCorrelationId;
}

Enrich logs with contextual information:
// Add global context that will be included in all logs
LogManager.AddContext("ApplicationName", "MyAwesomeApp");
LogManager.AddContext("EnvironmentName", "Production");
LogManager.AddContext("Version", "1.2.3");
// For a specific user session:
LogManager.AddContext("SessionId", sessionId);
LogManager.AddContext("UserId", userId);
// Remove context when no longer needed
LogManager.RemoveContext("SessionId");
// Clear all context
LogManager.ClearContext();

Route logs to different destinations based on content:
// Create a router with a default console logger
var router = new LogRouter(new ConsoleLogger())
// Send all errors to an error log file
.AddRoute(entry => entry.Level >= LogLevel.Error,
new FileLogger(LogLevel.Error, "errors.log"))
// Send debug logs to a debug file
.AddRoute(entry => entry.Level == LogLevel.Debug,
new FileLogger(LogLevel.Debug, "debug.log"))
// Send security-related logs to a secure log
.AddRoute(entry => entry.Message.Contains("security") ||
entry.Message.Contains("auth"),
new FileLogger(LogLevel.Info, "security.log"));
// Configure the log manager to use the router
LogManager.Configure(config =>
{
config.Logger = router;
});

Create more complex routing rules:
var router = new LogRouter(new ConsoleLogger())
// Send specific component logs to their own files
.AddRoute(
entry => entry.ClassName.Contains("Payment"),
new FileLogger(LogLevel.Info, "payment-service.log")
)
.AddRoute(
entry => entry.ClassName.Contains("User") &&
entry.MethodName.Contains("Auth"),
new FileLogger(LogLevel.Info, "user-auth.log")
)
// Send data change logs to a database
.AddRoute(
entry => entry.HasDataChanges &&
entry.EntityType == "User",
new DatabaseLogger(LogLevel.Info, "Data Source=audit.db")
)
// Send high-volume debug logs to a separate location
.AddRoute(
entry => entry.Level == LogLevel.Debug &&
entry.ClassName.Contains("HighVolume"),
new FileLogger(LogLevel.Debug, "high-volume-debug.log")
);

Configure logging to a SQLite database:
// Create a database logger
var dbLogger = new DatabaseLogger(
LogLevel.Info,
"Data Source=logs.db"
);
// Use as a standalone logger
LogManager.Configure(config =>
{
config.Logger = dbLogger;
});
// Or as part of a composite logger
LogManager.Configure(config =>
{
config.Logger = new CompositeLogger(new ILogger[] {
new ConsoleLogger(),
dbLogger
});
});

Ensure your database logging handles growth properly:
// For file-based logging with rotation
string baseDirectory = AppDomain.CurrentDomain.BaseDirectory;
string logDirectory = Path.Combine(baseDirectory, "logs");
Directory.CreateDirectory(logDirectory);
// Daily log files
string timestamp = DateTime.Now.ToString("yyyyMMdd");
string logFilePath = Path.Combine(logDirectory, $"application_{timestamp}.log");
// Database location
string dbPath = Path.Combine(logDirectory, "logs.db");
LogManager.Configure(config =>
{
config.Logger = new CompositeLogger(new ILogger[] {
new ConsoleLogger(),
new FileLogger(LogLevel.Info, logFilePath),
new DatabaseLogger(LogLevel.Info, $"Data Source={dbPath}")
});
});

Handle asynchronous methods properly:
public class AsyncService
{
[LogInfo]
public async Task<User> GetUserAsync(int userId)
{
await Task.Delay(100); // Simulate API call
return new User { Id = userId, Name = "John" };
}
[LogWarning]
public async Task ProcessDataAsync()
{
await Task.Delay(500);
// Process data
}
}
// Call async methods with logging
var service = new AsyncService();
// For methods that return a value
User user = await LoggedMethodCaller.CallAsync(() => service.GetUserAsync(123));
// For methods that don't return a value
await LoggedMethodCaller.CallAsync(() => service.ProcessDataAsync());

Track data changes in async methods:
[TrackDataChanges(IncludeOriginalState = true, IncludeUpdatedState = true)]
public async Task<User> UpdateUserAsync([BeforeChange] User user, UserUpdateRequest request)
{
await Task.Delay(100); // Simulate DB call
// Update user properties
user.Name = request.Name;
user.Email = request.Email;
return user;
}
// Usage
await LoggedMethodCaller.CallAsync(() => service.UpdateUserAsync(user, updateRequest));

Enable structured JSON logging for better downstream processing:
LogManager.Configure(config =>
{
// Enable structured JSON output
config.UseStructuredOutput = true;
// Use FileLogger with structured output
config.Logger = new FileLogger(
LogLevel.Info,
"logs/application.json",
useStructuredOutput: true
);
});

Use different formats for different log destinations:
LogManager.Configure(config =>
{
config.Logger = new CompositeLogger(new ILogger[] {
// Human-readable console output
new ConsoleLogger(LogLevel.Info, useStructuredOutput: false),
// JSON files for machine processing
new FileLogger(LogLevel.Info, "logs/application.json", useStructuredOutput: true),
// Database for structured querying
new DatabaseLogger(LogLevel.Info, "Data Source=logs.db")
});
});

Create a custom logger for special needs:
public class CloudLogger : ILogger
{
private readonly LogLevel _minimumLevel;
private readonly string _apiKey;
private readonly string _endpoint;
public CloudLogger(string apiKey, string endpoint, LogLevel minimumLevel = LogLevel.Info)
{
_apiKey = apiKey;
_endpoint = endpoint;
_minimumLevel = minimumLevel;
}
public void Log(LogEntry entry)
{
if (!IsEnabled(entry.Level)) return;
// Convert the entry to the format expected by your cloud service
var payload = new
{
timestamp = entry.Timestamp.ToString("o"),
level = entry.Level.ToString(),
message = entry.Message,
correlationId = entry.CorrelationId,
// Include other properties as needed
};
// Send to cloud service (implement with HttpClient)
Task.Run(async () =>
{
// Implement your API call here
// await httpClient.PostAsJsonAsync(_endpoint, payload);
});
}
public bool IsEnabled(LogLevel level)
{
return level >= _minimumLevel;
}
}
// Usage
LogManager.Configure(config =>
{
config.Logger = new CompositeLogger(new ILogger[] {
new ConsoleLogger(),
new CloudLogger(
apiKey: "your-api-key",
endpoint: "https://logging-service.example.com/api/logs"
)
});
});

Bridge to existing logging frameworks:
// Bridge to Serilog
public class SerilogLogger : ILogger
{
private readonly Serilog.ILogger _serilogLogger;
public SerilogLogger(Serilog.ILogger serilogLogger)
{
_serilogLogger = serilogLogger;
}
public void Log(LogEntry entry)
{
// Map AnnotationLogger levels to Serilog levels
var level = entry.Level switch
{
LogLevel.Trace => Serilog.Events.LogEventLevel.Verbose,
LogLevel.Debug => Serilog.Events.LogEventLevel.Debug,
LogLevel.Info => Serilog.Events.LogEventLevel.Information,
LogLevel.Warning => Serilog.Events.LogEventLevel.Warning,
LogLevel.Error => Serilog.Events.LogEventLevel.Error,
LogLevel.Critical => Serilog.Events.LogEventLevel.Fatal,
_ => Serilog.Events.LogEventLevel.Information
};
// Create properties dictionary
var properties = new Dictionary<string, object>();
if (entry.Parameters != null)
properties["Parameters"] = entry.Parameters;
if (entry.ReturnValue != null)
properties["ReturnValue"] = entry.ReturnValue;
if (entry.Context != null)
foreach (var ctx in entry.Context)
properties[ctx.Key] = ctx.Value;
// Log to Serilog
_serilogLogger
.ForContext("ClassName", entry.ClassName)
.ForContext("MethodName", entry.MethodName)
.ForContext("CorrelationId", entry.CorrelationId)
.Write(level, entry.Exception, entry.Message);
}
public bool IsEnabled(LogLevel level)
{
// Map and check Serilog level
var serilogLevel = level switch
{
LogLevel.Trace => Serilog.Events.LogEventLevel.Verbose,
LogLevel.Debug => Serilog.Events.LogEventLevel.Debug,
LogLevel.Info => Serilog.Events.LogEventLevel.Information,
LogLevel.Warning => Serilog.Events.LogEventLevel.Warning,
LogLevel.Error => Serilog.Events.LogEventLevel.Error,
LogLevel.Critical => Serilog.Events.LogEventLevel.Fatal,
_ => Serilog.Events.LogEventLevel.Information
};
return _serilogLogger.IsEnabled(serilogLevel);
}
}

Establish consistent patterns for attribute usage:
// Guidelines:
// - Use LogDebug for internal/development diagnostics
// - Use LogInfo for standard operational messages
// - Use LogWarning for unusual but non-error conditions
// - Use LogError for failures that can be handled
// - Use LogCritical for serious failures that might require intervention
// Examples:
[LogDebug]
private void InternalCalculation() { }
[LogInfo]
public User CreateUser(CreateUserRequest request) { /* ... */ return new User(); }
[LogWarning]
public bool ValidateUserInput(string input) { /* ... */ return true; }
[LogError]
public void ProcessPayment(Payment payment) { /* ... */ }
[LogCritical]
public void UpdateSystemConfiguration(SystemConfig config) { /* ... */ }

Structure your logs for easier analysis:
- Use consistent method names: Follow naming conventions for clarity
- Group related functionality: Keep similar functionality in the same classes
- Use appropriate log levels: Don't overuse high-severity levels
- Include contextual information: Add relevant context to logs
- Handle performance-sensitive areas: Use throttling for high-volume logs
Establish a clear strategy for sensitive data:
- Identify sensitive data: Audit your models to identify sensitive fields
- Apply appropriate attributes: Use `ExcludeFromLogs`, `MaskInLogs`, or `RedactContents`
- Document with reasons: Use the `reason` parameter to explain sensitivity
- Review third-party models: Apply attributes to DTOs from external libraries
- Test sensitive data handling: Verify that sensitive data is protected
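The last point — verifying that sensitive data is actually protected — can be automated in a test. The sketch below assumes only the `ILogger`/`LogEntry` contract shown in the custom-logger example; `CapturingLogger` is a hypothetical test helper, not part of AnnotationLogger, and how parameters are rendered into entries depends on your output format:

```csharp
// Hypothetical test-only logger that captures entries in memory
public class CapturingLogger : ILogger
{
    public List<LogEntry> Entries { get; } = new List<LogEntry>();
    public void Log(LogEntry entry) => Entries.Add(entry);
    public bool IsEnabled(LogLevel level) => true;
}

// Route all logs to the capturing logger for the duration of the test
var capture = new CapturingLogger();
LogManager.Configure(config => config.Logger = capture);

// Exercise a method whose parameters include a [MaskInLogs] password
var service = new UserService();
var user = new User { Id = 1, Username = "johndoe", Password = "hunter2" };
LoggedMethodCaller.Call(() => service.UpdateUser(user, "John Doe"));

// The raw secret must not appear in any captured entry
foreach (var entry in capture.Entries)
{
    // Checking the formatted message is a first approximation;
    // extend this to serialized parameters as needed
    Debug.Assert(entry.Message == null || !entry.Message.Contains("hunter2"),
        "Sensitive value leaked into logs");
}
```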
Optimize logging for high-traffic applications:
- Use throttling for high-frequency methods
- Be selective with data change tracking depth
- Consider log levels carefully - not everything needs to be Info
- Use parallel logging with `CompositeLogger` for better throughput
- Implement batching for database or cloud loggers
- Monitor logging performance with performance tracking
- Rotate logs to manage storage requirements
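Batching, from the list above, can be sketched as a wrapper around any slow sink. This assumes only the `ILogger` contract (`Log`/`IsEnabled`) shown in the custom-logger example; `BatchingLogger` itself is illustrative, not a built-in:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

// Illustrative batching wrapper: buffers entries and flushes them
// to the wrapped logger periodically or when the buffer fills up.
public class BatchingLogger : ILogger, IDisposable
{
    private readonly ILogger _inner;
    private readonly ConcurrentQueue<LogEntry> _queue = new ConcurrentQueue<LogEntry>();
    private readonly Timer _flushTimer;
    private readonly int _maxBatchSize;

    public BatchingLogger(ILogger inner, int flushIntervalMs = 2000, int maxBatchSize = 100)
    {
        _inner = inner;
        _maxBatchSize = maxBatchSize;
        _flushTimer = new Timer(_ => Flush(), null, flushIntervalMs, flushIntervalMs);
    }

    public void Log(LogEntry entry)
    {
        _queue.Enqueue(entry);
        if (_queue.Count >= _maxBatchSize)
            Flush();
    }

    public bool IsEnabled(LogLevel level) => _inner.IsEnabled(level);

    private void Flush()
    {
        // Drain the queue into the slow sink in one burst
        while (_queue.TryDequeue(out var entry))
            _inner.Log(entry);
    }

    public void Dispose()
    {
        _flushTimer.Dispose();
        Flush(); // don't lose buffered entries on shutdown
    }
}
```

Wrap only the slow sink and leave fast loggers direct, e.g. `new CompositeLogger(new ILogger[] { new ConsoleLogger(), new BatchingLogger(new DatabaseLogger(LogLevel.Info, "Data Source=logs.db")) })`.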
Issue: Logs aren't being generated for attributed methods
Solution:
- Ensure you're using `LoggedMethodCaller.Call()` to invoke the methods
- Check that the log level is enabled in your configuration
- Verify that your logger is properly configured
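To make the first point concrete: attributes alone don't intercept anything, so a plain call produces no log output. A quick check:

```csharp
var userService = new UserService();

// Direct invocation bypasses interception — no entries are logged,
// even though GetUser is annotated with [LogInfo]:
User silent = userService.GetUser(123);

// Wrapping the call activates the attribute-based logging:
User logged = LoggedMethodCaller.Call(() => userService.GetUser(123));
```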
Issue: Performance impact is too high
Solution:
- Apply throttling to high-frequency methods
- Reduce the `MaxComparisonDepth` for data change tracking
- Use more selective log levels
- Consider using a router to only log specific components in detail
Issue: Database logging errors
Solution:
- Ensure your connection string is properly formatted
- Check database permissions
- Verify that the SQLite provider is correctly installed
- Add error handling in your `DatabaseLogger`
Issue: Missing context information
Solution:
- Verify that context is added before the logs are generated
- Check if the context is being cleared unintentionally
- Ensure correlation IDs are properly propagated
Issue: Missing or incomplete data change logs
Solution:
- Ensure parameters are correctly marked with `[BeforeChange]`
- Verify that object types are compatible for comparison
- Check for `ExcludeFromComparison` attributes
- Increase `MaxComparisonDepth` if nested properties aren't being compared
To troubleshoot logger problems:
- Enable diagnostic logging:

LogManager.Configure(config =>
{
    // Add a console logger to see immediate output
    config.Logger = new CompositeLogger(new ILogger[] {
        config.Logger,
        new ConsoleLogger(LogLevel.Trace, useColors: true)
    });
});

- Verify logger configuration:

var config = LogManager.GetConfiguration();
Console.WriteLine($"Logger type: {config.Logger?.GetType().Name}");
Console.WriteLine($"Environment: {config.Environment}");
Console.WriteLine($"Min level: {GetMinimumLogLevel(config.Logger)}");

private static LogLevel GetMinimumLogLevel(ILogger logger)
{
    // Walk the levels from lowest to highest and return the first enabled one
    foreach (LogLevel level in Enum.GetValues(typeof(LogLevel)))
    {
        if (logger.IsEnabled(level))
            return level;
    }
    return LogLevel.Critical;
}

- Test each logger directly:

var fileLogger = new FileLogger(LogLevel.Trace, "test.log");
fileLogger.Log(new LogEntry
{
    Timestamp = DateTime.Now,
    Level = LogLevel.Info,
    Message = "Test entry"
});
This usage guide provides comprehensive instructions for effectively implementing AnnotationLogger in your applications. By following these guidelines, you'll be able to take full advantage of the framework's powerful features while maintaining a clean, maintainable codebase.