Merge branch 'main' into main_swisscows

commit c5724a8f57
Edoardo Ottavianelli, 2023-10-16 15:10:21 +02:00, committed by GitHub
3 changed files with 46 additions and 13 deletions

.github/workflows/check-duplicates.yml (new file)

@@ -0,0 +1,22 @@
+name: Check Duplicates
+
+on:
+  pull_request:
+    branches:
+      - main
+  push:
+    branches:
+      - main
+
+jobs:
+  check-duplicates:
+    runs-on: ubuntu-latest
+    steps:
+      - name: Check Out Code
+        uses: actions/checkout@v2
+
+      - name: Run Check Duplicates Script
+        run: |
+          chmod +x scripts/check-dups.sh
+          ./scripts/check-dups.sh
+        working-directory: ${{ github.workspace }}
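
The workflow runs the repository's duplicate checker on every push and pull request that targets main. For reference, a minimal sketch of the equivalent local run (the repository URL is taken from the README links below; the script path comes from the workflow itself):

```sh
# Reproduce the workflow's check locally, assuming a fresh clone.
git clone https://github.com/edoardottt/awesome-hacker-search-engines.git
cd awesome-hacker-search-engines
chmod +x scripts/check-dups.sh   # same step as the workflow's run block
./scripts/check-dups.sh          # exits non-zero when duplicates are found
```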

README.md

@@ -12,7 +12,7 @@ A curated list of awesome search engines useful during Penetration testing, Vuln
 <a href="https://github.com/edoardottt/awesome-hacker-search-engines#vulnerabilities" target="_blank">Vulnerabilities</a>
 <a href="https://github.com/edoardottt/awesome-hacker-search-engines#exploits" target="_blank">Exploits</a>
 <a href="https://github.com/edoardottt/awesome-hacker-search-engines#attack-surface" target="_blank">Attack surface</a>
-<a href="https://github.com/edoardottt/awesome-hacker-search-engines#code-search-engines" target="_blank">Code</a>
+<a href="https://github.com/edoardottt/awesome-hacker-search-engines#code" target="_blank">Code</a>
 <a href="https://github.com/edoardottt/awesome-hacker-search-engines#mail-addresses" target="_blank">Mail addresses</a>
 <a href="https://github.com/edoardottt/awesome-hacker-search-engines#domains" target="_blank">Domains</a>
 <a href="https://github.com/edoardottt/awesome-hacker-search-engines#urls" target="_blank">URLs</a>
@@ -42,6 +42,7 @@ A curated list of awesome search engines useful during Penetration testing, Vuln
 - [You](https://you.com/)
 - [SearXNG](https://searx.be/?q=)
 - [EXALead](http://www.exalead.com/search/web/)
+- [DuckDuckGo](https://duckduckgo.com/)
 - [Swisscows](https://swisscows.com/en)
 
 ### Servers
@@ -128,7 +129,7 @@ A curated list of awesome search engines useful during Penetration testing, Vuln
 - [Deepinfo](https://www.deepinfo.com/) - Empower your security with the most comprehensive Internet data
 - [Detectify](https://detectify.com/) - Complete External Attack Surface Management
 
-### Code Search Engines
+### Code
 
 - [GitHub Code Search](https://github.com/search?type=code) - Search globally across all of GitHub, or scope your search to a particular repository or organization
 - [GitLab Code Search](https://gitlab.com/) - Advanced search for faster, more efficient search across the entire GitLab instance
@@ -194,7 +195,6 @@ A curated list of awesome search engines useful during Penetration testing, Vuln
 - [MoonSearch](http://moonsearch.com/) - Backlinks checker & SEO Report
 - [sitereport.netcraft.com](https://sitereport.netcraft.com/) - Find out the infrastructure and technologies used by any site
 - [SynapsInt](https://synapsint.com/) - The unified OSINT research tool
-- [spyonweb.com](https://spyonweb.com/) - Find out related websites
 - [statscrop.com](https://www.statscrop.com/) - Millions of amazing websites across the web are being analyzed with StatsCrop
 - [securityheaders.com](https://securityheaders.com/) - Scan your site now
 - [visualsitemapper.com](http://www.visualsitemapper.com/) - Create a visual map of your site
@@ -375,6 +375,9 @@ These can be useful for osint and social engineering.
 - [Infringement Report](https://infringement.report/) - The web's best image copyright infringement search tool
 - [Tineye](https://tineye.com/) - Image search and recognition company
 - [Flickr](https://flickr.com/search/) - Home to tens of billions of photos and 2 million groups
+- [Sogou](https://pic.sogou.com/) - Chinese technology company that offers a search engine
+- [Jimpl](https://jimpl.com/) - Online photo metadata and EXIF data viewer
+- [Same Energy](https://same.energy/) - Find beautiful images
 
 ### Threat Intelligence

scripts/check-dups.sh

@@ -14,15 +14,23 @@ then
     readme="../README.md"
 fi
 
 links=$(cat $readme | egrep "\- \[" | wc -l)
+# Function to extract links from a section and check for duplicates
+check_section() {
+    section=$1
+    section_content=$(awk -v section="$section" '/^### / {p=0} {if(p)print} /^### '"$section"'/ {p=1}' "$readme")
+    duplicate_links=$(echo "$section_content" | grep -oP '\[.*?\]\(\K[^)]+' | sort | uniq -d)
+    if [[ -n $duplicate_links ]]; then
+        echo "[ ERR ] DUPLICATE LINKS FOUND"
+        echo "$duplicate_links"
+        exit 1
+    fi
+}
-uniqlinks=$(cat $readme | egrep "\- \[" | uniq | wc -l)
-
+# Get all unique section headings from the README file and handle spaces and slashes
+sections=$(grep '^### ' "$readme" | sed 's/^### //' | sed 's/[\/&]/\\&/g')
-if [[ $links -eq $uniqlinks ]];
-then
-    echo "[ OK! ] NO DUPLICATES FOUND."
-    echo "$links links in README."
-else
-    echo "[ ERR ] DUPLICATES FOUND!"
-    cat $readme | egrep "\- \[" | uniq -c | egrep -iv "1 - ["
-fi
+# Call the function for each section
+for section in $sections; do
+    check_section "$section"
+done
+echo "[ OK! ] NO DUPLICATES FOUND."