Caleb Lewis

When performing security testing against an asset, it is vital to have high-quality wordlists for content and subdomain discovery. This website provides you with wordlists that are up to date and effective against the most popular technologies on the internet.


When I now go to with Google Chrome I get an error message: "Error 404 Vhost unknown." This error message comes from Gandi, not from my other domain name registrar. So I conclude that the CNAME entry has taken effect.

So I did that. Since I cannot enter these lines directly but have to use an input web interface, I entered the TXT entry value once with and once without quotation marks, both for the subdomain and for the root domain (each entered into the "host" input field), just to be sure, because I am not allowed to enter "@" or "*" in that field.

I also flushed the CNAME and TXT entries several times in Google's public DNS, through the interface at -dns/cache, for both and. And I am using the Google DNS server as my primary DNS on this machine.

When I go to with Google Chrome, I still get the error "Error 404 Vhost unknown" - which is probably to be expected, judging from the DNS digging results. This error message comes from Gandi, so at least the CNAME (whether to or still to) has taken effect there. Only the TXT entry, which should authorize domain ownership, is still not recognized, I guess.

Is there something wrong with my TXT entry inputs? As I said, I added them both for (which is probably not in line with Gandi's instructions) and for (which would probably be in line with Gandi's instructions), simply because I could not enter "@" or "*" instead, and wanted to be sure that I included the entry that is expected. And I added each of them twice, once with quotation marks and once without, assuming that only the correct entry would be picked up by Gandi.

EDIT: It works now. I'm not quite sure, but my hunch is that the extra TXT entry that I added for was somehow in conflict with the CNAME for. The manual said to add the TXT only for and not for, but as I explained, I did both "just to be sure". I can't say for certain that this is what did the trick, but it was the last thing I changed, and now it works.

The second step for domains not at Gandi is adding the CNAME for the subdomain and the TXT at the domain apex: @ IN TXT is equivalent to IN TXT. As you can see from the manual, the form of this TXT record seems to be subdomain=hash. If your given record literally had test=s0m3r4nD0mG!bB3ri$hStr1n, it was probably meant for instead of your. In that case you need to start from the beginning by adding the exact subdomain you are planning to use.
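In zone-file form, the two placements discussed above would look something like this. This is only a sketch: the value is copied from the example above, and the subdomain label "test" is a placeholder, not your actual record.

```text
; TXT at the zone apex -- "@" stands for the domain itself
@      IN TXT  "test=s0m3r4nD0mG!bB3ri$hStr1n"

; TXT at a specific subdomain, named after the host it verifies
test   IN TXT  "test=s0m3r4nD0mG!bB3ri$hStr1n"
```

Which of the two your provider expects depends on their manual; they are different records and only one of them will be looked up.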

If you add a TXT record on a subdomain that already has a CNAME record, it's normal that it doesn't work: it'll show the TXT from the canonical name instead, just like in your results. If a hostname has a CNAME record, it must not have any resource records of other types. Care to know why? I have an answer, and AndrewB an even more detailed one, on a canonical question.
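This CNAME behavior (RFC 1034: a CNAME redirects the lookup, so other records at the same name are never served) can be illustrated with a toy resolver. Everything below - the zone dict, the hostnames, the record values - is made up for illustration:

```python
# Toy illustration of CNAME behavior: when a name owns a CNAME, a resolver
# follows it and answers from the canonical name, so a TXT record placed
# alongside the CNAME is never served.
ZONE = {
    # hypothetical records, for illustration only
    ("sub.example.com", "CNAME"): "target.hosting.example",
    ("sub.example.com", "TXT"): '"sub=abc123"',          # shadowed by the CNAME
    ("target.hosting.example", "TXT"): '"owner=hosting"',
}

def lookup(name, rtype, zone=ZONE):
    """Resolve rtype for name, following a CNAME first as real resolvers do."""
    cname = zone.get((name, "CNAME"))
    if cname is not None and rtype != "CNAME":
        # The answer comes from the canonical name, not from `name` itself.
        return lookup(cname, rtype, zone)
    return zone.get((name, rtype))

print(lookup("sub.example.com", "TXT"))  # prints "owner=hosting", not "sub=abc123"
```

This is exactly the pattern in the dig results above: the TXT answer belongs to the canonical name, and the TXT placed next to the CNAME is invisible.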

I just mentioned above that Google handles robots.txt files by subdomain and protocol. For example, a site can have one robots.txt file sitting on the non-www version, and a completely different one sitting on the www version. I have seen this happen several times over the years while helping clients and I just surfaced it again recently.

Beyond www and non-www, a site can have a robots.txt file sitting at the https version of a subdomain and then also at the http version of that subdomain. So, similar to what I explained above, there could be multiple robots.txt files with different instructions based on protocol.
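The point that different host variants can serve different rules can be demonstrated with Python's standard urllib.robotparser. The two rule sets below are hypothetical stand-ins for what the non-www and www versions of a site might serve:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents for two variants of the same site.
ROBOTS_NON_WWW = """
User-agent: *
Disallow: /private/
"""

ROBOTS_WWW = """
User-agent: *
Disallow: /
"""

def parser_for(robots_txt):
    """Build a parser from an in-memory robots.txt body."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp

non_www = parser_for(ROBOTS_NON_WWW)
www = parser_for(ROBOTS_WWW)

# The same path gets opposite answers depending on which host served robots.txt.
print(non_www.can_fetch("*", "/articles/post.html"))  # True
print(www.can_fetch("*", "/articles/post.html"))      # False
```

A crawler only ever uses the robots.txt of the exact host and protocol it is fetching from, so these two rule sets never "merge".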

While performing a crawl analysis and audit recently on a publisher site, I noticed that some pages being blocked by robots.txt were actually being crawled and indexed. I know that Google 100% obeys robots.txt instructions for crawling so this was clearly a red flag.

When checking the robots.txt file manually for the site, I saw one limited set of instructions on the non-www version. Then I started to manually check other versions of the site (by subdomain and protocol) to see if there were any issues.

When using that tool, you can view previous robots.txt files that Google has seen. And as you can guess, I saw both robots.txt files there. So yes, Google was officially seeing the second robots.txt file.

Needless to say, I quickly emailed my client with the information, screenshots, etc., and told them to remove the second robots.txt file and 301 redirect the www version to the non-www version. Now when Google visits the site and checks the robots.txt file, it will consistently see the correct set of instructions.

As a quick second example, a few years ago I was contacted by a site owner who was experiencing a drop in organic search traffic and had no idea why. After digging in, I decided to check the various versions of the site by protocol (including the robots.txt files for each version).

To dig into this situation, there are several tools that you can use beyond manually checking the robots.txt files per subdomain and protocol. The tools can also help surface the history of robots.txt files seen across a site.

I added ads.txt to my main site gogosgrouplaw.gr. But I want to use ads on my blog thegreeklawyer.gogoagrouplaw.gr. I added the subdomain to the AdSense panel. Am I ok?? The plugin says that I don't have an ads.txt on the subdomain, and this is true. What do I have to do?? Add the same ads.txt to the subdomain?? Please help, I don't want to lose Google ads.

Subdomain takeovers are a common, high-severity threat for organizations that regularly create and delete many resources. A subdomain takeover can occur when you have a DNS record that points to a deprovisioned Azure resource. Such DNS records are also known as "dangling DNS" entries. CNAME records are especially vulnerable to this threat. Subdomain takeovers enable malicious actors to redirect traffic intended for an organization's domain to a site performing malicious activity.

When a DNS record points to a resource that isn't available, the record itself should have been removed from your DNS zone. If it hasn't been deleted, it's a "dangling DNS" record and creates the possibility for subdomain takeover.

Dangling DNS entries make it possible for threat actors to take control of the associated DNS name to host a malicious website or service. Malicious pages and services on an organization's subdomain might result in:

Cookie harvesting from unsuspecting visitors - It's common for web apps to expose session cookies to subdomains (*), so any subdomain can access them. Threat actors can use a subdomain takeover to build an authentic-looking page, trick unsuspecting users into visiting it, and harvest their cookies (even secure cookies). A common misconception is that using SSL certificates protects your site, and your users' cookies, from a takeover. However, a threat actor can use the hijacked subdomain to apply for and receive a valid SSL certificate. Valid SSL certificates grant them access to secure cookies and can further increase the perceived legitimacy of the malicious site.

Phishing campaigns - Authentic-looking subdomains might be used in phishing campaigns. This is true for malicious sites and for MX records that would allow the threat actor to receive emails addressed to a legitimate subdomain of a known-safe brand.
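The cookie-scoping point above comes down to the Domain attribute: a cookie set with Domain=.example.com is sent to every subdomain, including a hijacked one. A minimal sketch with Python's standard http.cookies (the cookie name, value, and domain are hypothetical):

```python
from http.cookies import SimpleCookie

# A session cookie scoped to the whole registered domain. Domain=.example.com
# means the browser sends it to every subdomain of example.com.
cookie = SimpleCookie()
cookie["session"] = "abc123"
cookie["session"]["domain"] = ".example.com"
cookie["session"]["secure"] = True  # Secure does not limit WHICH subdomains receive it

header = cookie.output()
print(header)  # e.g. Set-Cookie: session=abc123; Domain=.example.com; Secure
```

Note that the Secure flag only requires HTTPS; since a takeover attacker can obtain a valid certificate for the hijacked subdomain, Secure cookies are still exposed.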

Review your DNS zones and identify CNAME records that are dangling or have been taken over. If subdomains are found to be dangling or have been taken over, remove the vulnerable subdomains and mitigate the risks with the following steps:
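The review step above amounts to comparing your zone's CNAME targets against the resources that still exist. A toy sketch of that comparison - all records, hostnames, and the inventory set here are hypothetical, and a real review would query DNS and your cloud inventory instead of dicts:

```python
# Sketch of a dangling-CNAME review: compare the CNAME targets in a DNS zone
# export against the hostnames of resources that are still provisioned.
zone_records = {
    # subdomain -> CNAME target (hypothetical)
    "shop.example.com":   "shop-prod.azurewebsites.net",
    "legacy.example.com": "legacy-app.azurewebsites.net",  # resource was deleted
}

# Hostnames of resources that still exist (e.g. from an inventory export).
provisioned = {"shop-prod.azurewebsites.net"}

def dangling_cnames(records, live):
    """Return the subdomains whose CNAME target no longer exists."""
    return sorted(sub for sub, target in records.items() if target not in live)

print(dangling_cnames(zone_records, provisioned))  # ['legacy.example.com']
```

Any subdomain this surfaces is a takeover candidate: anyone who can register the deleted target name inherits your traffic.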

If your application logic is such that secrets (such as OAuth credentials) or privacy-sensitive information was sent to the dangling subdomain, that data might have been exposed to third parties.

When creating DNS entries for Azure App Service, create an asuid.subdomain TXT record with the Domain Verification ID. When such a TXT record exists, no other Azure subscription can validate the Custom Domain, that is, take it over.
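As a sketch, for a hypothetical subdomain app.example.com the pair of records would look something like this in zone-file form (the CNAME target and the verification ID are placeholders):

```text
; CNAME for the subdomain, plus the asuid TXT that proves ownership
app        IN CNAME  contoso-app.azurewebsites.net.
asuid.app  IN TXT    "<your-domain-verification-id>"
```

Because the asuid record sits on its own name rather than on the subdomain itself, it does not conflict with the CNAME, and it keeps protecting the name even if the CNAME later dangles.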

In order to ensure optimal deliverability and successful delivery of emails to Gmail addresses, Journey Optimizer allows you to add special Google site verification TXT records to your subdomain to make sure that it is verified.

dnsmap scans a domain for common subdomains using a built-in or an external wordlist (if specified using the -w option). The internal wordlist has around 1000 words in English and Spanish, such as ns1, firewall, servicios and smtp. So it will be possible to search for inside automatically. Results can be saved in CSV and human-readable format for further processing. dnsmap does NOT require root privileges to run, and should NOT be run with such privileges for security reasons.
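At its core, this kind of wordlist-based discovery just tries each wordlist entry as a label under the target domain and keeps the names that resolve. A toy sketch of that loop - resolve() here is a stand-in backed by a fake table, where a real tool would perform DNS lookups, and all names are hypothetical:

```python
# Toy sketch of wordlist-based subdomain discovery.
WORDLIST = ["ns1", "smtp", "firewall", "servicios", "dev"]

FAKE_DNS = {  # hypothetical answers a real resolver might return
    "ns1.example.com": "192.0.2.10",
    "smtp.example.com": "192.0.2.25",
}

def resolve(hostname, table=FAKE_DNS):
    """Stand-in for a DNS A lookup: return an IP string, or None."""
    return table.get(hostname)

def enumerate_subdomains(domain, wordlist):
    """Return (hostname, ip) pairs for wordlist entries that resolve."""
    found = []
    for word in wordlist:
        hostname = f"{word}.{domain}"
        ip = resolve(hostname)
        if ip is not None:
            found.append((hostname, ip))
    return found

print(enumerate_subdomains("example.com", WORDLIST))
# [('ns1.example.com', '192.0.2.10'), ('smtp.example.com', '192.0.2.25')]
```

This also shows why the quality of the wordlist matters so much: the scan can only find labels that appear in it.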

Any "polite" bot will request the robots.txt file at. So this request must serve the necessary response. There is nothing you can do in a "parent" robots.txt file to influence other domains/subdomains, since it's simply never requested for this hostname.

This specifically looks for the .dev. subdomain (after the client subdomain) in the request. If found, then it internally rewrites any request for robots.txt to robots-disallow.txt, where robots-disallow.txt consists of something like:

User-agent: *
Disallow: /
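The internal rewrite itself could be sketched, for example, with Apache mod_rewrite. This is a hypothetical reconstruction (the original answer's exact rule and server are not shown here):

```apache
# If the hostname contains ".dev." (a staging subdomain), serve the
# disallow-all file in place of the real robots.txt.
RewriteCond %{HTTP_HOST} \.dev\.
RewriteRule ^/?robots\.txt$ /robots-disallow.txt [L]
```

Because the rewrite is internal, the bot still sees a normal 200 response at /robots.txt, just with the blocking rules, so staging hosts stay out of the index without touching production's robots.txt.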

Say I bought from GoDaddy, but I only ever want to use the subdomain. I want my Office 365 email to come from, not. I want to browse to, or even, but never.

