FEATURES:
- Uses multiple search engines (Google, Bing, DuckDuckGo)
- Proxy support (also works without proxies)
- Automatic checker that tests whether the found links are SQL injectable

TUTORIAL:
Prerequisites
1. Python 3 installed on your system.
2. The `requirements.txt` file in the same directory as the script.

Installation
1. Open a terminal or command prompt.
2. Navigate to the directory containing your script and the `requirements.txt` file.
3. Run the following command to install the required dependencies:
```bash
pip install -r requirements.txt
```

Usage
Run the script using the following command:
```bash
python sql.py
```
The script will prompt you with the following options:
```text
Choose an option:
(1) Only search for the selected search engine using dorks
(2) Search the selected search engine for dorks and check which link is SQL injectable
Your choice (1 or 2):
```
Enter 1 to only search the selected search engine using dorks, or enter 2 to search and also check which of the found links are SQL injectable.
If you choose option 1, the script will ask you:
```text
Do you want to use proxies? (yes/no):
```
Enter yes if you want to use proxies, or no if you don't.
If you choose to use proxies, provide the following information:
```text
Enter the path to the proxy file (formatted as url:username:password):
Enter the type of proxies (http, https, socks5, etc.):
```
Enter the path to the proxy file (formatted as url:username:password) and the type of proxies you are using (e.g., http, https, or socks5).
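For reference, here is a minimal sketch of how a proxy file in that format could be parsed into requests-style proxy entries. The function name and the use of the requests library are illustrative assumptions, not necessarily how sql.py handles proxies.
```python
# Minimal sketch (assumption): parse a url:username:password proxy file into
# requests-style proxy dicts. Not sql.py's actual code.
def load_proxies(path, proxy_type):
    proxies = []
    with open(path) as proxy_file:
        for line in proxy_file:
            line = line.strip()
            if not line:
                continue
            # rsplit so a url that itself contains a colon (e.g. host:port)
            # still parses; the last two fields are username and password.
            url, username, password = line.rsplit(":", 2)
            proxy_url = f"{proxy_type}://{username}:{password}@{url}"
            proxies.append({"http": proxy_url, "https": proxy_url})
    return proxies

# Example: proxies = load_proxies("proxies.txt", "socks5")
```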
The script will then prompt you to choose the search engine:
```text
Which search engine do you want to scrape? (google/bing/duckduckgo):
```
Enter the name of the search engine you want to use (google, bing, or duckduckgo).
Next, enter your dork query or choose from the example dorks provided:
```text
Enter your dork query (e.g., inurl:"product.php?pid=") or choose from these dorks: inurl:"index.php?id=", inurl:"trainers.php?id=", ...
```
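To make the dork-based search concrete, here is a rough sketch of how a dork and a page number could be turned into a search URL for each supported engine. The URL templates and pagination parameters are assumptions about how these engines are commonly queried, not a description of sql.py's internals.
```python
from urllib.parse import quote_plus

# Illustrative only: these URL templates and pagination parameters are
# assumptions, not sql.py's actual query-building code.
SEARCH_URLS = {
    "google": "https://www.google.com/search?q={query}&start={offset}",
    "bing": "https://www.bing.com/search?q={query}&first={offset}",
    "duckduckgo": "https://html.duckduckgo.com/html/?q={query}&s={offset}",
}

def build_search_url(engine, dork, page, results_per_page=10):
    template = SEARCH_URLS[engine]
    return template.format(query=quote_plus(dork), offset=page * results_per_page)

# Example: build_search_url("google", 'inurl:"index.php?id="', page=0)
```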
Finally, enter the number of pages you want to scrape:
```text
Enter the number of pages to scrape:
```
The script will start searching the selected search engine using the provided dork query and save the found links to the `found_links.txt` file.
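As a rough idea of what this step can look like, the sketch below fetches a results page, collects candidate links, and appends them to found_links.txt. It assumes the requests and BeautifulSoup libraries and may differ from sql.py's actual parsing logic.
```python
import requests
from bs4 import BeautifulSoup

# Hypothetical sketch of the scraping step; the selector logic and file
# handling are assumptions, not necessarily how sql.py is implemented.
def scrape_links(search_url, proxies=None):
    headers = {"User-Agent": "Mozilla/5.0"}  # many engines block bare clients
    response = requests.get(search_url, headers=headers, proxies=proxies, timeout=10)
    soup = BeautifulSoup(response.text, "html.parser")
    links = []
    for anchor in soup.find_all("a", href=True):
        href = anchor["href"]
        # Keep absolute URLs with query parameters, the usual SQLi candidates.
        if href.startswith("http") and "=" in href:
            links.append(href)
    return links

def save_links(links, path="found_links.txt"):
    with open(path, "a") as output:
        for link in links:
            output.write(link + "\n")
```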
If you chose option 2, the script will also check the found links for SQL injection vulnerabilities and save the results to the `vulnerable_links.txt` file.
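A common way to implement this kind of check is to append a single quote to each URL and look for database error messages in the response. The sketch below illustrates that idea; the error signatures and helper names are assumptions rather than sql.py's actual implementation.
```python
import requests

# Error signatures commonly returned by SQL backends; the real checker in
# sql.py may use a different or more thorough detection strategy.
SQL_ERRORS = (
    "you have an error in your sql syntax",
    "warning: mysql",
    "unclosed quotation mark",
    "quoted string not properly terminated",
)

def looks_injectable(url, proxies=None):
    """Naive check: append a single quote and look for database error text."""
    try:
        response = requests.get(url + "'", proxies=proxies, timeout=10)
    except requests.RequestException:
        return False
    body = response.text.lower()
    return any(error in body for error in SQL_ERRORS)

# Example:
# vulnerable = [link for link in found_links if looks_injectable(link)]
# with open("vulnerable_links.txt", "w") as output:
#     output.write("\n".join(vulnerable))
```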
Thanks