So, I have installed this plugin and connected it to an external Redis server that I host.
This Redis server listens on an uncommon TCP port, and socat redirects the traffic from that port to the usual Redis port 6379 over TCP. Technically the site connects just fine, but whenever I activate the cache, performance tanks drastically, both in the wp-admin panel and on the main website.
I don't think that using socat over TCP instead of a Unix socket is what causes such a massive performance impact, since the cache is barely growing anyway.
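For context, the forwarder on the Redis host is a plain socat TCP relay, something along these lines (the exact invocation on my box may differ):

socat TCP-LISTEN:51820,fork,reuseaddr TCP:127.0.0.1:6379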
This is the plugin status:
Status: Connected
Client: Predis (v2.1.2)
Drop-in: Valid
Disabled: No
Ping: PONG
Errors: []
PhpRedis: Not loaded
Relay: Not loaded
Predis: 2.1.2
Credis: Not loaded
PHP Version: 7.4.33
Plugin Version: 2.5.4
Redis Version: 6.0.16
Multisite: No
Metrics: Enabled
Metrics recorded: 1
Filesystem: Writable
Global Prefix: "wp_"
Blog Prefix: "wp_"
Timeout: 3
Read Timeout: 3
Retry Interval:
WP_REDIS_HOST: "MY_HOST"
WP_REDIS_PORT: 51820
WP_REDIS_TIMEOUT: 3
WP_REDIS_READ_TIMEOUT: 3
WP_REDIS_MAXTTL: 86400
WP_REDIS_PLUGIN_PATH: "/home2/HOSTING/public_html/MYSITE.com/wp-content/plugins/redis-cache"
WP_REDIS_SELECTIVE_FLUSH: true
WP_REDIS_PASSWORD: ••••••••
Global Groups: [
"blog-details",
"blog-id-cache",
"blog-lookup",
"global-posts",
"networks",
"rss",
"sites",
"site-details",
"site-lookup",
"site-options",
"site-transient",
"users",
"useremail",
"userlogins",
"usermeta",
"user_meta",
"userslugs",
"redis-cache",
"blog_meta",
"image_editor",
"network-queries",
"site-queries",
"theme_files",
"translation_files",
"user-queries",
"code_snippets"
]
Ignored Groups: [
"counts",
"plugins",
"theme_json",
"themes"
]
Unflushable Groups: []
Groups Types: {
"blog-details": "global",
"blog-id-cache": "global",
"blog-lookup": "global",
"global-posts": "global",
"networks": "global",
"rss": "global",
"sites": "global",
"site-details": "global",
"site-lookup": "global",
"site-options": "global",
"site-transient": "global",
"users": "global",
"useremail": "global",
"userlogins": "global",
"usermeta": "global",
"user_meta": "global",
"userslugs": "global",
"redis-cache": "global",
"blog_meta": "global",
"image_editor": "global",
"network-queries": "global",
"site-queries": "global",
"theme_files": "global",
"translation_files": "global",
"user-queries": "global",
"counts": "ignored",
"plugins": "ignored",
"theme_json": "ignored",
"code_snippets": "global",
"themes": "ignored"
}
Drop-ins: [
"advanced-cache.php v by ",
"Redis Object Cache Drop-In v2.5.4 by Till Krüss"
]
If I monitor Redis usage, it seems to be processing a lot of WooCommerce queries. I can't paste everything, but this is part of what I get with redis-cli MONITOR, so that part seems all right: WordPress traffic is going through Redis.
"wp:options:aioseo_options_internal_localized"
1743155668.415724 [0 127.0.0.1:51968] "GET" "wp:options:notoptions"
1743155668.449887 [0 127.0.0.1:51968] "GET" "wp:options:aioseo_options_localized"
1743155668.496202 [0 127.0.0.1:51968] "GET" "wp:options:uninstall_plugins"
1743155668.528388 [0 127.0.0.1:51968] "GET" "wp:options:breeze_inherit_settings"
1743155668.639770 [0 127.0.0.1:51968] "GET" "wp:transient:jetpack_autoloader_plugin_paths"
1743155668.692106 [0 127.0.0.1:51968] "GET" "wp:options:woocommerce_attribute_lookup_optimized_updates"
1743155668.725288 [0 127.0.0.1:51968] "GET" "wp:options:woocommerce_custom_orders_table_data_sync_enabled"
1743155668.762443 [0 127.0.0.1:51968] "GET" "wp:options:woocommerce_custom_orders_table_background_sync_mode"
1743155668.794895 [0 127.0.0.1:51968] "GET" "wp:options:woocommerce_hpos_datastore_caching_enabled"
1743155668.878655 [0 127.0.0.1:51968] "GET" "wp:options:wc_feature_woocommerce_brands_enabled"
1743155668.913744 [0 127.0.0.1:51968] "GET" "wp:code_snippets:active_snippets_global_single-use_front-end_wp_snippets"
1743155668.945857 [0 127.0.0.1:51968] "GET" "wp:users:1"
1743155668.977459 [0 127.0.0.1:51968] "GET" "wp:user_meta:1"
1743155669.101243 [0 127.0.0.1:51968] "MGET" "wp:options:auth_key" "wp:options:auth_salt" "wp:options:secure_auth_key" "wp:options:secure_auth_salt" "wp:options:logged_in_key" "wp:options:logged_in_salt" "wp:options:nonce_key" "wp:options:nonce_salt" "wp:options:secret_key"
1743155669.135100 [0 127.0.0.1:51968] "GET" "wp:options:autoptimize_ccss_rules"
1743155669.166190 [0 127.0.0.1:51968] "GET" "wp:options:autoptimize_ccss_additional"
1743155669.197344 [0 127.0.0.1:51968] "GET" "wp:options:autoptimize_ccss_queue"
1743155669.228374 [0 127.0.0.1:51968] "GET" "wp:options:autoptimize_ccss_viewport"
1743155669.259639 [0 127.0.0.1:51968] "GET" "wp:options:autoptimize_ccss_finclude"
1743155669.290896 [0 127.0.0.1:51968] "GET" "wp:options:autoptimize_ccss_rtimelimit"
1743155669.322127 [0 127.0.0.1:51968] "GET" "wp:options:autoptimize_ccss_noptimize"
1743155669.353586 [0 127.0.0.1:51968] "GET" "wp:options:autoptimize_ccss_debug"
1743155669.384662 [0 127.0.0.1:51968] "GET" "wp:options:autoptimize_ccss_key"
1743155669.415845 [0 127.0.0.1:51968] "GET" "wp:options:autoptimize_ccss_keyst"
1743155669.446905 [0 127.0.0.1:51968] "GET" "wp:options:autoptimize_ccss_loggedin"
1743155669.478144 [0 127.0.0.1:51968] "GET" "wp:options:autoptimize_ccss_forcepath"
1743155669.509301 [0 127.0.0.1:51968] "GET" "wp:options:autoptimize_ccss_deferjquery"
1743155669.540355 [0 127.0.0.1:51968] "GET" "wp:options:autoptimize_ccss_domain"
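One detail that stands out in the MONITOR output above: consecutive GETs are roughly 30-50 ms apart, so the per-command round trip may be worth measuring directly from the web host, something like this (host, port and password are placeholders from my config):

redis-cli -h MY_HOST -p 51820 -a 'XXXX' --latency
redis-cli -h MY_HOST -p 51820 -a 'XXXX' --latency-history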
This is my Redis server status:
Memory
used_memory:4305776
used_memory_human:4.11M
used_memory_rss:15163392
used_memory_rss_human:14.46M
used_memory_peak:9385464
used_memory_peak_human:8.95M
used_memory_peak_perc:45.88%
used_memory_overhead:860368
used_memory_startup:809752
used_memory_dataset:3445408
used_memory_dataset_perc:98.55%
allocator_allocated:4670528
allocator_active:5111808
allocator_resident:7999488
total_system_memory:8164917248
total_system_memory_human:7.60G
used_memory_lua:41984
used_memory_lua_human:41.00K
used_memory_scripts:0
used_memory_scripts_human:0B
number_of_cached_scripts:0
maxmemory:2147483648
maxmemory_human:2.00G
maxmemory_policy:allkeys-lru
allocator_frag_ratio:1.09
allocator_frag_bytes:441280
allocator_rss_ratio:1.56
allocator_rss_bytes:2887680
rss_overhead_ratio:1.90
rss_overhead_bytes:7163904
mem_fragmentation_ratio:3.57
mem_fragmentation_bytes:10919600
mem_not_counted_for_evict:0
mem_replication_backlog:0
mem_clients_slaves:0
mem_clients_normal:0
mem_aof_buffer:0
mem_allocator:jemalloc-5.2.1
active_defrag_running:0
lazyfree_pending_objects:0
1) "maxmemory"
2) "2147483648"
1) "maxmemory-policy"
2) "allkeys-lru"
Used memory stays at around 5 MB and never goes above that, so the cache is barely being filled.
If I do a scan and check the TTL of the objects, everything seems fine too, meaning the keys are persistent:
redis-cli -a 'XXXX' --scan | head -n 20    # returns keys like wp:posts:7192...
redis-cli -a 'XXXX' ttl wp:posts:7192      # mostly returns (-1)
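For completeness, the key count and the server's hit/miss counters can be checked the same way:

redis-cli -a 'XXXX' dbsize
redis-cli -a 'XXXX' info keyspace
redis-cli -a 'XXXX' info stats | grep -E 'keyspace_(hits|misses)'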
To test further, I added a cache hit test script to my theme:
add_action('init', function () {
    if (function_exists('wp_cache_get')) {
        $test_key   = 'chatgpt_test_key';
        $test_value = 'hello_redis';

        // Write to the object cache, then read the value back.
        wp_cache_set($test_key, $test_value);
        $fetched = wp_cache_get($test_key);

        if ($fetched === $test_value) {
            error_log("[Redis Test] HIT ✅ - Value: $fetched");
        } else {
            error_log("[Redis Test] MISS ❌");
        }
    }
});
tail -f wp-content/debug.log | grep "Redis Test"
And I do get hits in my logs, so technically it is connecting and saving some things, but it seems like only hello_redis:
/home2/XXXX/public_html/XXXX.com/wp-includes/functions.php on line 6114
[28-Mar-2025 10:35:23 UTC] [Redis Test] HIT ✅ - Value: hello_redis
[28-Mar-2025 10:35:23 UTC] [Redis Test] HIT ✅ - Value: hello_redis
[28-Mar-2025 10:35:41 UTC] PHP Notice: Function _load_textdomain_just_in_time was called incorrectly. Translation loading for the google-analytics-for-wordpress domain was triggered too early. This is usually an indicator for some code in the plugin or theme running too early. Translations should be loaded at the init action or later. Please see Debugging in WordPress (https://developer.wordpress.org/advanced-administration/debug/debug-wordpress/) for more information. (This mes
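To quantify the slowdown itself, the front-end response time could be compared with the object cache drop-in enabled and disabled, e.g. (URL is a placeholder):

curl -o /dev/null -s -w 'total: %{time_total}s\n' https://MYSITE.com/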
I'm also using Breeze cache and Jetpack CDN for images, but those don't seem to be the culprit.
What might be causing the massive slowdown once I activate Redis Object Cache?