When working with MikroTik RouterOS scripts (particularly in older versions such as 5.14), network-dependent operations like /tool fetch can hang script execution when they hit a connection failure. This becomes problematic when you need scripts to keep running regardless of whether a remote resource is temporarily unavailable.
The fundamental problem lies in how RouterOS handles synchronous operations. Unlike modern programming environments with robust exception handling, RouterOS scripts (especially in version 5.14) lack traditional try-catch mechanisms. When /tool fetch fails, it doesn't just return an error code; it can stall script execution entirely.
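A minimal illustration of the failure mode (the URL is a placeholder):

```routeros
# Naive approach: on older RouterOS, if the host is unreachable
# the fetch below can hang and the log line is never reached
/tool fetch url="http://unreachable.example.com/file.txt"
:log info "fetch finished"
```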
Here's an effective pattern that combines global variables with asynchronous execution:
# Main script controller
:global done
:global url "http://google.com"
:set done false

# Execute fetch in separate context
:execute safe-fetch

# Implement timeout mechanism
:local counter 0
:while (($done != true) && ($counter < 10)) do={
    :set counter ($counter + 1)
    :delay 0.2
}

# Result evaluation
:if ($done = true) do={
    :put "Fetch completed successfully"
} else={
    :put "Fetch operation timed out"
}
Create a separate script named "safe-fetch" with this content:
:global done
:global url
/tool fetch url=$url
:set done true
- The :execute command runs the fetch operation in a separate context
- Global variables act as communication channels between contexts
- The timeout counter prevents infinite waiting (adjust the 10 attempts / 0.2 s delay as needed)
- This pattern works around the blocking nature of fetch operations
For more robust implementations consider:
# Enhanced version with logging
:global fetchStatus
:global fetchTarget
:set fetchTarget "http://example.com/api"
:set fetchStatus "pending"
:execute advanced-fetch
# timeout of 30 seconds, polled every 0.5 s = 60 attempts
# (comments must sit on their own line, and RouterOS integer
# arithmetic cannot divide by a fractional interval)
:local timeout 30
:local attempts 60
:while (($fetchStatus = "pending") && ($attempts > 0)) do={
    :set attempts ($attempts - 1)
    :delay 0.5
}
:if ($fetchStatus = "success") do={
:log info "Fetch completed"
# Process successful fetch
} else={
:log warning "Fetch failed after $timeout seconds"
# Fallback logic
}
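The controller above assumes a companion script named "advanced-fetch"; a minimal sketch of its contents (the name and the "success" status value are carried over from the controller above, not from any built-in convention):

```routeros
# advanced-fetch: runs in its own context via :execute
:global fetchStatus
:global fetchTarget
/tool fetch url=$fetchTarget
# only reached if the fetch did not abort this script,
# so fetchStatus stays "pending" on failure
:set fetchStatus "success"
```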
For RouterOS v6.x and newer, consider these additional options:
- Wrapping the call in :do { ... } on-error={ ... } (available from v6.2)
- Using /tool fetch with keep-result=no when only the side effect matters
- Leveraging the scheduler for retry logic
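As a sketch of the improved error handling available from v6.2 onward, a fetch can be wrapped in :do with an on-error block so a failure no longer aborts the script (the URL and file name are placeholders):

```routeros
# v6.2+: catch a failing fetch without a helper script
:do {
    /tool fetch url="http://example.com/api" dst-path=tempfile
    :log info "fetch succeeded"
} on-error={
    :log warning "fetch failed, continuing"
}
```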
When implementing network operations in MikroTik scripts:
- Always assume network operations may fail
- Implement timeout mechanisms for all blocking operations
- Consider using separate scripts for critical operations
- Log operation outcomes for debugging
- Test with intentionally broken connections during development
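To tie these practices together, retry logic can be delegated to the scheduler so a failed attempt blocks nothing else. This sketch assumes v6.2+ for on-error; the script and scheduler names are made up for illustration:

```routeros
# Contents of a script named "fetch-retry": on success it removes
# its own scheduler entry; on failure it just logs and waits for
# the next scheduled run
:do {
    /tool fetch url="http://example.com/api" dst-path=tempfile
    /system scheduler remove [find name="fetch-retry-job"]
} on-error={
    :log warning "fetch attempt failed, will retry"
}
```

Schedule it with /system scheduler add name="fetch-retry-job" interval=1m on-event="fetch-retry".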
When working with RouterOS scripts (particularly on older versions like 5.14), you'll encounter situations where the /tool fetch command fails without returning an error and hangs your script execution. This commonly happens when:
- The target URL is temporarily unavailable
- DNS resolution fails
- Network connectivity issues exist
- The remote server times out
RouterOS scripting lacks traditional try-catch mechanisms found in modern programming languages. The /tool fetch
command doesn't return standard error codes that you can check in subsequent script logic.
Here's an improved version of the workaround that addresses the hanging issue:
# Main script
:global fetchSuccess
:global targetURL "http://example.com/api"
/system script add name="safeFetch" source=\
":global fetchSuccess
:global targetURL
/tool fetch url=\"\$targetURL\" dst-path=tempfile
:set fetchSuccess true"
# Execution wrapper
:set fetchSuccess false
:execute safeFetch
# Timeout implementation
# timeout of 5 seconds, polled every 0.2 s = 25 attempts
# (comment on its own line; RouterOS integer arithmetic
# cannot divide by a fractional interval)
:local timeout 5
:local attempts 25
:local counter 0
:while (($fetchSuccess != true) && ($counter < $attempts)) do={
    :set counter ($counter + 1)
    :delay 0.2
}

:if ($fetchSuccess) do={
    :put "Fetch operation completed successfully"
    /file remove tempfile
} else={
    :put "Fetch timed out after $timeout seconds"
}
# Clean up the helper script either way so the next run can re-add it
/system script remove safeFetch
For production environments, consider these enhancements:
# Add error logging (the log file must already exist)
:local logFile "fetch_errors.log"
:if ($fetchSuccess != true) do={
    :local stamp ([/system clock get date] . " " . [/system clock get time])
    /file set $logFile contents=([/file get $logFile contents] . "\n$stamp Failed to fetch $targetURL")
}
# Multiple URL fallback
# (RouterOS has no :break, so guard each iteration with the flag,
#  and give the asynchronous :execute time to finish)
:global backupURLs {"http://backup1.example.com";"http://backup2.example.com"}
:foreach url in=$backupURLs do={
    :if ($fetchSuccess != true) do={
        :set targetURL $url
        :execute safeFetch
        :delay 2
    }
}
This solution works across RouterOS versions, but newer versions (6.x+) offer additional options:
- A timeout parameter in /tool fetch
- Improved script error handling
- Better logging capabilities