Proposed Security.txt will work like Robots.txt
Ed Foudil, a web developer and security researcher, has submitted a draft to the IETF (Internet Engineering Task Force) seeking the standardization of security.txt, a file that webmasters can host at their domain root to describe the site's security policies.
The file is akin to robots.txt, a standard used by websites to communicate and define policies for web and search engine crawlers.
Security.txt is for security-related problems
The distinction between security.txt and robots.txt is that security.txt communicates a company's security practices only, and is meant to be read by humans rather than by automated scanners.
For example, a security researcher who finds a vulnerability on a website can check the site's security.txt file for information on how to contact the company and securely report the issue.
According to the current IETF draft, website owners would be able to create security.txt files that look like this:
# This is a comment
Contact: email@example.com
Contact: +1-201-555-0123
Contact: https://example.com/security
Encryption: https://example.com/pgp-key.txt
Acknowledgement: https://example.com/acknowledgements.html
Disclosure: Full
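Because the format is line-oriented ("Field: value" pairs, with "#" marking comments), it is simple to consume programmatically. The sketch below, which is not part of the draft itself, shows one way a client might parse a security.txt file; the sample content mirrors the example above.

```python
def parse_security_txt(text):
    """Parse security.txt content into a list of (field, value) pairs.

    A list is used rather than a dict because fields such as
    Contact may legitimately appear more than once.
    """
    fields = []
    for line in text.splitlines():
        line = line.strip()
        # Skip blank lines and comments.
        if not line or line.startswith("#"):
            continue
        if ":" in line:
            # Split on the first colon only, so URL values survive intact.
            field, _, value = line.partition(":")
            fields.append((field.strip(), value.strip()))
    return fields

sample = """\
# This is a comment
Contact: email@example.com
Contact: https://example.com/security
Encryption: https://example.com/pgp-key.txt
Disclosure: Full
"""

for field, value in parse_security_txt(sample):
    print(field, "->", value)
```

Note that splitting on the first colon only is what keeps URL values such as `https://example.com/security` from being truncated.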
Infosec community welcomed the idea
Speaking to Bleeping Computer, Foudil says he came up with the idea after attending the DEF CON security conference and the H1702 CTF event in the US at the start of August.
“During that time I was reflecting on the amazing contributions some of the people from the events in [Las] Vegas make to the security industry and our society as a whole,” Foudil told Bleeping. “This motivated me to stop keeping my ideas to myself and start working on projects and sharing my ideas.”
This is when Foudil put together a first version of the security.txt specification that he later published on GitHub. Early feedback from the IT security industry convinced the researcher to go on.
“When x0rz [well-known security researcher] tweeted about my proposal I realized that this was something people really wanted and that it was time to start writing up an RFC draft,” Foudil said.
The researcher had lots of help from people in the infosec industry. Foudil says feedback from HackerOne, Bugcrowd, Google, and others helped him shape his IETF proposal.
Start small now. Improve and get better later.
The current IETF draft of security.txt only includes support for four directives (Contact, Encryption, Disclosure, and Acknowledgement). The security.txt GitHub repo lists many more directives, such as In-scope, Out-of-scope-vuln, Rate-limit, Platform, Reward, Payment-method, Currency, Donate, and Disallow.
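For illustration, a file using those GitHub-repo directives could look something like the following. The directive names come from the repo list above; the values are hypothetical and not taken from any real deployment or from the IETF draft:

```
Contact: security@example.com
In-scope: *.example.com
Out-of-scope-vuln: Clickjacking on static pages
Rate-limit: 5 requests per second
Platform: HackerOne
Reward: Up to $500
Payment-method: PayPal
Currency: USD
Donate: https://example.com/donate
```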
Foudil explained to Bleeping Computer why he axed most of the directives, which, in hindsight, were quite helpful and would have given security.txt more depth.
“A major tech company told me that I am better off starting slow and seeing how companies start using security.txt, then with the help of those companies’ feedback, security.txt can be adapted with new directives,” Foudil said.
“Casey Ellis used the expression ‘axe first, then sandpaper’, implying that now that I have all these ideas, I should start chiseling away at them one by one, ending up with a well-thought-through draft,” the researcher added.
“Now that I have published the Internet draft I get emails on a regular basis with valuable input that allows me to see where possible changes could be made in the future,” Foudil said.
Bug bounty platforms have offered to help
Right now, security.txt has the status of an Internet Draft, the first step in the IETF standardization process, which also includes the RFC (Request for Comments) and official Internet Standard stages.
“Once security.txt becomes an RFC the focus will shift to spreading the word and encouraging companies to set up a security.txt file,” Foudil told Bleeping Computer.
“Several bug bounty platforms have already offered to help out with this step and hopefully if some of the big companies have a security.txt this will set a good example that could convince others to follow suit.”