High CPU Usage - Spammed by Meta?

I seem to be getting spammed by requests like these:

57.141.16.24 - - [20/Apr/2026:04:00:50 +0800] "GET /cart/?remove_item=bfd2308e9e75263970f8079115edebbd&_wpnonce=87af8e445c&add-to-cart=4571 HTTP/2.0" 200 1452108 "-" "meta-externalagent/1.1 (+https://developers.facebook.com/docs/sharing/webmasters/crawler)"
57.141.16.45 - - [20/Apr/2026:04:00:50 +0800] "GET /cart/?remove_item=a927541d74113c53810fb0b383a649ac&_wpnonce=c5368d0313&add-to-cart=4571 HTTP/2.0" 200 1452108 "-" "meta-externalagent/1.1 (+https://developers.facebook.com/docs/sharing/webmasters/crawler)"
57.141.16.92 - - [20/Apr/2026:04:00:51 +0800] "GET /cart/?remove_item=752d2c9ecfe079e5e5f3539f4d750e5c&_wpnonce=54da3d2995&add-to-cart=4619 HTTP/2.0" 200 1452212 "-" "meta-externalagent/1.1 (+https://developers.facebook.com/docs/sharing/webmasters/crawler)"
57.141.16.100 - - [20/Apr/2026:04:00:51 +0800] "GET /cart/?remove_item=752d2c9ecfe079e5e5f3539f4d750e5c&_wpnonce=c7f9ae23ba&add-to-cart=4521 HTTP/2.0" 200 1452118 "-" "meta-externalagent/1.1 (+https://developers.facebook.com/docs/sharing/webmasters/crawler)"

I have no idea why, but in one week this particular user agent has consumed 5 TB of data. From my analysis it seems to be coming from Facebook. Any idea what I should do?

I have tried disabling the Meta pixel, but that does not seem to change anything.
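For anyone curious, this is roughly how I summed up the traffic per user agent. A rough sketch: it assumes the standard combined log format, and the log path in the usage comment is just an example.

```shell
# Total response bytes per user agent from a combined-format access log.
# Splitting on double quotes, field 6 is the user agent and field 3 holds
# " <status> <bytes> ".
bytes_by_agent() {
  awk -F'"' '{
    split($3, f, " ")          # f[1] = status code, f[2] = response bytes
    bytes[$6] += f[2]          # accumulate bytes per user-agent string
  }
  END {
    for (ua in bytes) printf "%d %s\n", bytes[ua], ua
  }' "$1" | sort -rn
}

# Usage (path is an example, adjust to your server):
#   bytes_by_agent /var/log/nginx/access.log | head
```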

If that’s the real Facebook crawler, you could add this to your robots.txt (it should respect it):

[other rules]

User-agent: meta-externalagent  
Disallow: /cart/  
Disallow: /checkout/  
Disallow: /*add-to-cart=*  
Disallow: /*remove_item=*  
Disallow: /wp-admin

5 TB is a lot for a stupid crawler; I would have blocked the IP range 57.141.16.0/24 days ago.
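For example, if you’re on nginx, you could deny that range before the requests ever reach PHP/WordPress. A sketch, assuming a standard nginx setup; adapt it to your own server block:

```nginx
# Inside the relevant server { } or http { } block:
# return 403 to the offending range. Blocking at the firewall instead
# would also save the TLS handshake and connection overhead.
deny 57.141.16.0/24;
```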


I ended up blocking the IP range and updating robots.txt, and that seems to have stopped it for now. I saw on Reddit that there has recently been an uptick in Meta crawling for AI. But this is the most aggressive I have seen in some time; maybe it was broken.


Check my posts here. I had my servers freezing because of several AI crawlers that did not respect the robots.txt file. I ended up blocking 10K+ bot IPs and ranges.

Link, please?

Topic
I had issues, and @sahsanu directed me towards resolution as well.