Making Phishing Pages Undetectable using ANTIBOTS — 3 Easy Steps! (Source Included)


Phishing sites get detected because they don’t quite look like the real website they imitate and often contain malicious code that triggers security alerts. Many web browsers, email clients, and extensions are designed to report phishing attempts and warn users of potential danger. In addition, web hosts, internet service providers, and other organizations run systems that detect and block phishing attempts coming from IP addresses or domains previously used for malicious purposes. Search engine bots, spam bots, and scraping bots all play an important role in flagging a phishing page.

Also Read : Best Ultimate Guide to Reverse Engineering Android Apps

Phishing sites are marked “red” because they are identified as malicious or suspicious by anti-virus software, browsers, or services that monitor this kind of activity. Some of the reasons a phishing page gets flagged are:

Suspicious URLs: Phishing pages often have URLs that differ from the original website, or contain extra characters and symbols that make them look suspicious.
Malicious content: Phishing pages may contain malicious code or links that can compromise the security of a user’s device or steal sensitive information.
Bad reputation: If a page has previously been reported or detected as a phishing page, it may have a bad reputation, which is why it is marked in red.
Blacklists: Phishing pages may also appear on blacklists maintained by information security organizations, browsers, or other companies. These blacklists protect users from accessing malicious websites.

example of a flagged phishing page

Below are 3 easy steps a hacker follows to block bots on a phishing page:

🦋 STEP 001 — BLOCK UNWANTED COUNTRY TRAFFIC

To make sure the phishing page is only reachable by traffic from the desired countries, the hacker can add a checkpoint that checks whether the visitor’s IP address belongs to a country listed in an array; if not, the visitor is simply blocked with an error message. In some cases the hacker can also add logging to record the IP address, date, and reason for the denial (country not allowed, or VPN/proxy detected) to a text file. In the example below I have added such a logging feature, which writes all of this to a file named “access_denied.log”.

📑 Source Code for STEP 001: (PHP)

<?php
// XIT was here!

// Get the user's IP address
$ip_address = $_SERVER['REMOTE_ADDR'];

// Make a request to the IP Geolocation API
$url = 'https://api.ipgeolocation.io/ipgeo?apiKey=YOUR_API_KEY&ip=' . $ip_address;
$response = file_get_contents($url);
$result = json_decode($response, true);

// Check if the user is from a banned country
$banned_countries = array('CN', 'RU', 'PK');
if (in_array($result['country_code'], $banned_countries)) {
    $log = "[" . date("Y-m-d H:i:s") . "] Access denied for IP " . $ip_address . ": country not allowed\n";
    file_put_contents('access_denied.log', $log, FILE_APPEND);
    die('Access denied: country not allowed');
}

// Check if the user is using a VPN or proxy
if ($result['is_proxy'] || $result['is_vpn']) {
    $log = "[" . date("Y-m-d H:i:s") . "] Access denied for IP " . $ip_address . ": VPN or proxy detected\n";
    file_put_contents('access_denied.log', $log, FILE_APPEND);
    die('Access denied: VPN or proxy detected');
}

// Continue with the rest of the phishing page code

In the above example, we get the visitor’s IP address and store it in the $ip_address variable, then make a GET request to the API to fetch the country info (make sure to replace YOUR_API_KEY with your own API key, which you can get from ipgeolocation.io). Next, we check whether the user’s country code from the response exists in the array of blocked countries; if it does, the script logs the attempt to the log file and displays an error to the user. If not, it continues by checking whether the visitor is using a VPN or proxy. If either is detected, the script blocks the request and logs the data; otherwise it lets the user access the phishing page.
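
Since the goal described above is to let in only traffic from the target country, the same check can also be inverted into an allow-list. Below is a minimal sketch of that variant; it reuses the same ipgeolocation.io response fields as the example above, and the $allowed_countries entries are placeholder country codes to adapt:

<?php
// Sketch: allow-list variant of STEP 001.
// Only countries in $allowed_countries may reach the page; everyone else
// (and anyone whose lookup fails) is blocked.
$ip_address = $_SERVER['REMOTE_ADDR'];
$url = 'https://api.ipgeolocation.io/ipgeo?apiKey=YOUR_API_KEY&ip=' . $ip_address;
$response = @file_get_contents($url);
$result = $response ? json_decode($response, true) : null;

$allowed_countries = array('US', 'GB'); // example target countries only

if (!$result || !in_array($result['country_code'], $allowed_countries)) {
    $log = "[" . date("Y-m-d H:i:s") . "] Access denied for IP " . $ip_address . ": country not in allow-list\n";
    file_put_contents('access_denied.log', $log, FILE_APPEND);
    die('Access denied');
}
// Continue with the rest of the page code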

🦋 STEP 002 — BLOCK BAD IP TRAFFIC

To make sure the phishing page is only accessible to whitelisted IPs (normal users) and not to blacklisted IPs that try to mess with the URL, it is necessary to block certain known-bad IPs.

The following resources provide daily-updated bad-IP and bot lists for free, which you can include in your antibot database (a sketch of pulling one of these feeds follows the list):

lists.blocklist.de

www.projecthoneypot.org

www.threatminer.org

IBM X-Force Exchange

exchange.xforce.ibmcloud.com

iplists.firehol.org

https://raw.githubusercontent.com/stamparm/ipsum/master/ipsum.txt
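
As a quick illustration of how one of these feeds could be pulled into a local blacklist file, here is a minimal sketch using the ipsum list linked above. It assumes the published format of one IP address followed by a hit count per line (comment lines start with #); the output filename banned_ips.txt is just an example:

<?php
// Sketch: refresh a local banned-IP file from the ipsum feed.
// Assumes each data line is "<ip> <hit count>"; lines starting with # are comments.
$feed = 'https://raw.githubusercontent.com/stamparm/ipsum/master/ipsum.txt';
$raw  = @file_get_contents($feed);
if ($raw === false) {
    exit("Could not download feed\n");
}

$ips = array();
foreach (preg_split('/\r?\n/', $raw) as $line) {
    $line = trim($line);
    if ($line === '' || $line[0] === '#') {
        continue; // skip comments and blank lines
    }
    $parts = preg_split('/\s+/', $line);
    // Keep only entries that look like valid IP addresses
    if (filter_var($parts[0], FILTER_VALIDATE_IP)) {
        $ips[] = $parts[0];
    }
}

// Write one IP per line; banned_ips.txt is an example filename
file_put_contents('banned_ips.txt', implode("\n", $ips) . "\n");
echo count($ips) . " IPs written to banned_ips.txt\n";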

📑 Source Code for STEP 002: (PHP)

<?php
// XIT was here!

// Define an array to store banned IP addresses
$banned_ips = array('1.2.3.4', '5.6.7.8');

// Get the visitor's IP address
$visitor_ip = $_SERVER['REMOTE_ADDR'];

// Check if the visitor's IP address is in the banned IPs list
if (in_array($visitor_ip, $banned_ips)) {
    // If the visitor's IP is banned, log the attempted access
    $log = "Access attempt from banned IP address: " . $visitor_ip . " at " . date("Y-m-d H:i:s") . "\n";
    error_log($log, 3, "access_log.txt");

    // Display an error message
    die('Access Denied - Your IP address is banned');
}

// ---
// OR YOU CAN EVEN LOAD THE LIST FROM A FILE
// ---
// Load the list of banned IPs from a file (one IP per line)
$banned_ips = load_banned_ips();

// Function to load the list of banned IPs from a file
function load_banned_ips() {
    // banned_ips.txt holds one IP address per line
    $lines = file('banned_ips.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    return $lines !== false ? array_map('trim', $lines) : array();
}

// If the visitor's IP is not banned, continue processing the phishing page
// ...
?>

In the above example, we create an array to store the list of blacklisted IPs, store the visitor’s IP in the $visitor_ip variable, and then check whether the visitor’s IP exists in the array of blacklisted IPs. If it does, the script logs it to a file and displays an error to the user; if not, it continues with the phishing page code. Similarly, I’ve added one more function below it which loads the blacklisted IPs from an external file instead of a hard-coded array.

You can achieve the same thing with .htaccess files. An .htaccess file can block blacklisted IP addresses by adding rules that restrict access to the phishing page based on the visitor’s IP address. Below is the source with an explanation:

📑 Source Code for STEP 002: (.htaccess)

# XIT was here!
# Block a specific IP address
Order Allow,Deny
Deny from 1.2.3.4
Allow from all
# Block a range of IP addresses
Order Allow,Deny
Deny from 1.2.3.0/24
Allow from all

In the first example above, access from the IP address 1.2.3.4 is denied, while all other IP addresses are allowed.

In the second example, access from the range 1.2.3.0/24 (which covers the 256 addresses 1.2.3.0 through 1.2.3.255) is denied, while all other IP addresses are allowed.

A hacker can add multiple Deny from rules to block several IP addresses or ranges, as shown below:

# XIT was here!
# Block multiple IP addresses and ranges
Order Allow,Deny
Deny from 1.2.3.4
Deny from 1.2.3.0/24
Deny from 5.6.7.8
Allow from all

These rules should be added to the .htaccess file in the root directory of the web server, or in the directory that contains the phishing page to be protected. Note that .htaccess overrides must be enabled on the web server (via AllowOverride) for these rules to work.
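
One caveat not mentioned above: the Order/Deny/Allow directives shown here are the older Apache 2.2 syntax and only work on Apache 2.4 through the mod_access_compat module. A rough equivalent using the newer Require directives would look like this:

# Apache 2.4 style: deny specific addresses/ranges, allow everyone else
<RequireAll>
    Require all granted
    Require not ip 1.2.3.4
    Require not ip 1.2.3.0/24
    Require not ip 5.6.7.8
</RequireAll>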

🦋 STEP 003 — BLOCK BAD USERAGENT TRAFFIC

To make sure the phishing page is only accessible to humans and not to bots that try to crawl or scan the URL, it is necessary to block certain bad UserAgents. Below are some examples of bot UserAgents:

Googlebot: Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
Bingbot: Mozilla/5.0 (compatible; Bingbot/2.0; +http://www.bing.com/bingbot.htm)
Yahoo! Slurp: Mozilla/5.0 (compatible; Yahoo! Slurp; http://help.yahoo.com/help/us/ysearch/slurp)
Baidu Spider: Mozilla/5.0 (compatible; Baiduspider/2.0; +http://www.baidu.com/search/spider.html)
Yandex Bot: Mozilla/5.0 (compatible; YandexBot/3.0; +http://yandex.com/bots)
Sogou Spider: Mozilla/5.0 (compatible; Sogou Spider/3.0; +http://www.sogou.com/docs/help/webmasters.htm#07)
Facebot: facebookexternalhit/1.1 (+http://www.facebook.com/externalhit_uatext.php)
Tweetbot: Twitterbot/1.0
AppleBot: AppleBot/0.1 (+https://www.apple.com/go/applebot)
DuckDuckBot: DuckDuckBot/1.0; (+http://duckduckgo.com/duckduckbot.html)
Exabot: Mozilla/5.0 (compatible; Exabot/3.0; +http://www.exabot.com/go/robot)
ia_archiver: ia_archiver (+http://www.alexa.com/site/help/webmasters; crawler@alexa.com)
TelegramBot: TelegramBot (like TwitterBot)

You can block these using PHP, .htaccess, or even a robots.txt file. The sources for each follow:

📑 Source Code for STEP 003: (PHP)

<?php
// XIT was here!
$user_agent = $_SERVER['HTTP_USER_AGENT'];

// List of bot user agent keywords or patterns
$bots = array("bot", "crawler", "spider");

// Loop through the list of bots and check if the user agent contains a keyword or pattern
foreach ($bots as $bot) {
    if (stripos($user_agent, $bot) !== false) {
        // Block the bot by returning a 403 Forbidden response
        header("HTTP/1.0 403 Forbidden");
        exit;
    }
}
?>

In the above example, we get the visitor’s UserAgent, store it in a variable, and compare it against each entry in the array of bot keywords. If the UserAgent contains any of the blacklisted words, the script blocks the user from accessing the page; if not, it lets the visit through. Below I’ll attach the shorter version of this in PHP.

<?php
// XIT was here!
// Match the crawler name tokens from the UserAgent strings listed above
if (preg_match('/Googlebot|Bingbot|Slurp|Baiduspider|YandexBot|Sogou|Exabot/i', $_SERVER['HTTP_USER_AGENT'])) {
    header('HTTP/1.0 403 Forbidden');
    exit();
}
?>

📑 Source Code for STEP 003: (.htaccess)

RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} (Googlebot|Bingbot|Slurp|Baiduspider|YandexBot|Sogou|Exabot) [NC]
RewriteRule .* - [F,L]

In this example, the RewriteCond directive is used to match the user agent string against a list of known crawler user agents. If a match is found, the request is blocked by the RewriteRule directive, which returns a ‘403 Forbidden’ HTTP response.
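
A closely related trick, not part of the original rules and optional, is to also reject requests that send no User-Agent header at all, which many simple scanners and scripts do:

RewriteEngine on
# Block requests with an empty or missing User-Agent header
RewriteCond %{HTTP_USER_AGENT} ^-?$
RewriteRule .* - [F,L]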

You can also do this using the “User-agent” directive in the robots.txt file. The User-agent directive lets you specify which user agents (bots) are allowed or disallowed from accessing the page; keep in mind that robots.txt is purely advisory, so only well-behaved crawlers honor it. Below I’ll attach a list of known bot UserAgents:

📑 Source Code for STEP 003: (robots.txt)

User-agent: Alexibot
Disallow: /
User-agent: Aqua_Products
Disallow: /
User-agent: asterias
Disallow:/
User-agent: b2w/0.1
Disallow: /
User-agent: BackDoorBot/1.0
Disallow:/
User-agent: Black Hole
Disallow:/
User-agent: BlowFish/1.0
Disallow:/
User-agent: Bookmark search tool
Disallow: /
User-agent: BotALot
Disallow:/
User-agent: BotRightHere
Disallow: /
User-agent: BuiltBotTough
Disallow:/
User-agent: Bullseye/1.0
Disallow:/
User-agent: Mozilla/4.0 (compatible; BullsEye; Windows 95)
Disallow: /
User-agent: BunnySlippers
Disallow:/
User-agent: Cegbfeieh
Disallow:/
User-agent: CheeseBot
Disallow:/
User-agent: CherryPicker
Disallow:/
User-agent: CherryPickerElite/1.0
Disallow:/
User-agent: CherryPickerSE/1.0
Disallow:/
User-agent: Copernic
Disallow: /
User-agent: CopyRightCheck
Disallow:/
User-agent: cosmos
Disallow:/
User-agent: Crescent
Disallow:/
User-agent: Crescent Internet ToolPak HTTP OLE Control v.1.0
Disallow:/
User-agent: DittoSpyder
Disallow:/
User-agent: EmailCollector
Disallow:/
User-agent: EmailSiphon
Disallow:/
User-agent: EmailWolf
Disallow:/
User-agent: EroCrawler
Disallow:/
User-agent: ExtractorPro
Disallow:/
User-agent: FairAd Client
Disallow: /
User-agent: Flaming AttackBot
Disallow: /
User-agent: Foobot
Disallow:/
User-agent: Gaisbot
Disallow: /
User-agent: GetRight/4.2
Disallow: /
User-agent: Harvest/1.5
Disallow:/
User-agent: hloader
Disallow:/
User-agent: httplib
Disallow:/
User-agent: HTTrack 3.0
Disallow: /
User-agent: humanlinks
Disallow:/
User-agent: InfoNaviRobot
Disallow:/
User-agent: Iron33/1.0.2
Disallow: /
User-agent: JennyBot
Disallow:/
User-agent: Kenjin Spider
Disallow:/
User-agent: Keyword Density/0.9
Disallow:/
User-agent: larbin
Disallow: /
User-agent: LexiBot
Disallow:/
User-agent: libWeb/clsHTTP
Disallow:/
User-agent: LinkextractorPro
Disallow:/
User-agent: LinkScan/8.1a Unix
Disallow:/
User-agent: LinkWalker
Disallow:/
User-agent: LNSpiderguy
Disallow:/
User-agent: lwp-trivial
Disallow:/
User-agent: lwp-trivial/1.34
Disallow:/
User-agent: Mata Hari
Disallow:/
User-agent: Microsoft URL Control
Disallow: /
User-agent: Microsoft URL Control – 5.01.4511
Disallow:/
User-agent: Microsoft URL Control – 6.00.8169
Disallow:/
User-agent: MIIxpc
Disallow:/
User-agent: MIIxpc/4.2
Disallow:/
User-agent: Mister PiX
Disallow:/
User-agent: moget
Disallow:/
User-agent: moget/2.1
Disallow:/
User-agent: mozilla/4
Disallow:/
User-agent: Mozilla/4.0 (compatible; BullsEye; Windows 95)
Disallow:/
User-agent: Mozilla/4.0 (compatible; MSIE 4.0; Windows 95)
Disallow:/
User-agent: Mozilla/4.0 (compatible; MSIE 4.0; Windows 98)
Disallow:/
User-agent: Mozilla/4.0 (compatible; MSIE 4.0; Windows NT)
Disallow:/
User-agent: Mozilla/4.0 (compatible; MSIE 4.0; Windows XP)
Disallow:/
User-agent: Mozilla/4.0 (compatible; MSIE 4.0; Windows 2000)
Disallow:/
User-agent: Mozilla/4.0 (compatible; MSIE 4.0; Windows ME)
Disallow:/
User-agent: mozilla/5
Disallow:/
User-agent: MSIECrawler
Disallow: /
User-agent: NetAnts
Disallow:/
User-agent: NICErsPRO
Disallow:/
User-agent: Offline Explorer
Disallow:/
User-agent: Openbot
Disallow: /
User-agent: Openfind
Disallow:/
User-agent: Openfind data gathere
Disallow:/
User-agent: Oracle Ultra Search
Disallow: /
User-agent: PerMan
Disallow: /
User-agent: ProPowerBot/2.14
Disallow:/
User-agent: ProWebWalker
Disallow:/
User-agent: psbot
Disallow: /
User-agent: Python-urllib
Disallow: /
User-agent: QueryN Metasearch
Disallow:/
User-agent: Radiation Retriever 1.1
Disallow: /
User-agent: RepoMonkey
Disallow:/
User-agent: RepoMonkey Bait & Tackle/v1.01
Disallow:/
User-agent: RMA
Disallow:/
User-agent: searchpreview
Disallow: /
User-agent: SiteSnagger
Disallow:/
User-agent: SpankBot
Disallow:/
User-agent: spanner
Disallow:/
User-agent: suzuran
Disallow:/
User-agent: Szukacz/1.4
Disallow:/
User-agent: Teleport
Disallow:/
User-agent: TeleportPro
Disallow:/
User-agent: Telesoft
Disallow:/
User-agent: The Intraformant
Disallow:/
User-agent: TheNomad
Disallow:/
User-agent: TightTwatBot
Disallow:/
User-agent: Titan
Disallow:/
User-agent: toCrawl/UrlDispatcher
Disallow:/
User-agent: True_Robot
Disallow:/
User-agent: True_Robot/1.0
Disallow:/
User-agent: turingos
Disallow:/
User-agent: TurnitinBot
Disallow: /
User-agent: TurnitinBot/1.5
Disallow: /
User-agent: URL Control
Disallow: /
User-agent: URL_Spider_Pro
Disallow: /
User-agent: URLy Warning
Disallow:/
User-agent: VCI
Disallow:/
User-agent: VCI WebViewer VCI WebViewer Win32
Disallow:/
User-agent: Web Image Collector
Disallow:/
User-agent: WebAuto
Disallow:/
User-agent: WebBandit
Disallow:/
User-agent: WebBandit/3.50
Disallow:/
User-agent: WebCapture 2.0
Disallow: /
User-agent: WebCopier
Disallow:/
User-agent: WebCopier v.2.2
Disallow: /
User-agent: WebCopier v3.2a
Disallow: /
User-agent: WebEnhancer
Disallow:/
User-agent: WebmasterWorldForumBot
Disallow:/
User-agent: WebSauger
Disallow:/
User-agent: Website Quester
Disallow:/
User-agent: Webster Pro
Disallow:/
User-agent: WebStripper
Disallow:/
User-agent: WebZip
Disallow:/
User-agent: WebZip/4.0
Disallow:/
User-agent: WebZIP/4.21
Disallow: /
User-agent: WebZIP/5.0
Disallow: /
User-agent: Wget
Disallow:/
User-agent: Wget/1.5.3
Disallow:/
User-agent: Wget/1.6
Disallow:/
User-agent: WWW-Collector-E
Disallow:/
User-agent: Xenu's
Disallow: /
User-agent: Xenu's Link Sleuth 1.1c
Disallow: /
User-agent: Zeus
Disallow:/
User-agent: Zeus 32297 Webster Pro V2.9 Win32
Disallow:/
User-agent: Zeus Link Scout
Disallow: /
User-agent: Googlebot
Disallow: /
User-agent: googlebot-image
Disallow: /
User-agent: googlebot-mobile
Disallow: /
User-agent: MSNBot
Disallow: /
User-agent: Slurp
Disallow: /
User-agent: Teoma
Disallow: /
User-agent: Gigabot
Disallow: /
User-agent: Robozilla
Disallow: /
User-agent: Nutch
Disallow: /
User-agent: ia_archiver
Disallow: /
User-agent: baiduspider
Disallow: /
User-agent: naverbot
Disallow: /
User-agent: yeti
Disallow: /
User-agent: yahoo-mmcrawler
Disallow: /
User-agent: psbot
Disallow: /
User-agent: yahoo-blogs/v3.9
Disallow: /
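
To tie the three steps together, here is a rough sketch of how the checks above could be chained into a single include pulled in at the top of the page. The helper name, filenames, and patterns (antibot.php, banned_ips.txt, the UserAgent keywords) are illustrative placeholders, not a fixed implementation:

<?php
// antibot.php - illustrative sketch chaining the STEP 001-003 checks

function deny($reason) {
    // Log the denial and stop serving the page
    $log = "[" . date("Y-m-d H:i:s") . "] " . $_SERVER['REMOTE_ADDR'] . ": " . $reason . "\n";
    file_put_contents('access_denied.log', $log, FILE_APPEND);
    header('HTTP/1.0 403 Forbidden');
    exit;
}

// STEP 003: block obvious bot UserAgents first (no external lookup needed)
$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
if ($ua === '' || preg_match('/bot|crawler|spider/i', $ua)) {
    deny('bad or empty UserAgent');
}

// STEP 002: block IPs found in the local blacklist file (one IP per line)
$banned_ips = @file('banned_ips.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
if ($banned_ips && in_array($_SERVER['REMOTE_ADDR'], $banned_ips)) {
    deny('blacklisted IP');
}

// STEP 001: optional country check via the geolocation API shown earlier
// (call the ipgeolocation.io lookup here and deny() if the country is not allowed)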

Also Read : Top Picks: The Best Cyber Security Search Engines of the Year 2022
