CrossLinked – Tool to Extract Valid Employee Names From LinkedIn Through Search Engine Scraping

CrossLinked

CrossLinked is a LinkedIn enumeration tool that uses search engine scraping to collect valid employee names from a target organization. This technique provides accurate results without the use of API keys, credentials, or even accessing the site directly. Formats can then be applied in the command line arguments to turn these names into email addresses, domain accounts, and more.
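The idea behind the name formats can be sketched in a few lines of Python. This is an illustration of the technique, not CrossLinked's actual code; the `apply_format` helper and the sample names are hypothetical:

```python
# Hypothetical sketch of how CrossLinked-style format strings such as
# '{first}.{last}@domain.com' turn scraped names into account identifiers.

def apply_format(fmt, first, last):
    """Fill a format string using a single employee name.

    Supports the placeholders {first}, {last}, {f} (first initial),
    and {l} (last initial), all lowercased.
    """
    first, last = first.lower(), last.lower()
    return fmt.format(first=first, last=last, f=first[0], l=last[0])

names = [("Jane", "Doe"), ("John", "Smith")]
emails = [apply_format("{first}.{last}@domain.com", f, l) for f, l in names]
print(emails)  # ['jane.doe@domain.com', 'john.smith@domain.com']
```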

CrossLinked simplifies the process of searching LinkedIn to collect valid employee names when performing password spraying or other security testing against an organization. It uses search engine scraping capabilities similar to those found in tools like subscraper and pymeta.

For a full breakdown of the tool and example output, check out:
https://m8r0wn.com/posts/2021/01/crosslinked.html

Setup

git clone https://github.com/m8r0wn/crosslinked
cd crosslinked
pip3 install -r requirements.txt

Examples

python3 crosslinked.py -f '{first}.{last}@domain.com' company_name

python3 crosslinked.py -f 'domain\{f}{last}' -t 45 -j 0.5 company_name

Usage

positional arguments:
  company_name        Target company name

optional arguments:
  -h, --help          show this help message and exit
  -t TIMEOUT          Max timeout per search (Default=20, 0=None)
  -j JITTER           Jitter between requests (Default=0)
  -v                  Show names and titles recovered after enumeration

Search arguments:
  -H HEADER           Add Header ('name1=value1;name2=value2;')
  --search ENGINE     Search Engine (Default='google,bing')
  --safe              Only parse names with company in title (Reduces false positives)

Output arguments:
  -f NFORMAT          Format names, ex: 'domain\{f}{last}', '{first}.{last}@domain.com'
  -o OUTFILE          Change name of output file (default=names.txt)

Proxy arguments:
  --proxy PROXY       Proxy requests (IP:Port)
  --proxy-file PROXY  Load proxies from file for rotation


Proxy Support

The latest version of CrossLinked provides proxy support through the Taser library. Users can mask their traffic with a single proxy by adding --proxy 127.0.0.1:8080 to the command line arguments, or use --proxy-file proxies.txt for rotating source addresses.

HTTP/HTTPS proxies can be added in IP:PORT notation, while SOCKS requires a socks4:// or socks5:// prefix.
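A minimal sketch of what rotating source addresses looks like in practice. This is illustrative only; CrossLinked delegates proxy handling to the Taser library, and the `proxies.txt` file and helper names here are hypothetical:

```python
# Round-robin proxy rotation sketch (not Taser's implementation).
# Entries may be plain 'IP:PORT' (treated as HTTP) or carry a
# socks4:// / socks5:// prefix, as described above.
import itertools

def load_proxies(path):
    with open(path) as fh:
        return [line.strip() for line in fh if line.strip()]

def rotate(proxies):
    # Cycle through the list forever, yielding a requests-style proxies dict.
    for proxy in itertools.cycle(proxies):
        if not proxy.startswith(("http://", "socks4://", "socks5://")):
            proxy = "http://" + proxy  # bare IP:PORT defaults to HTTP
        yield {"http": proxy, "https": proxy}

# Usage sketch (hypothetical file and target):
# for proxies in rotate(load_proxies("proxies.txt")):
#     requests.get("https://www.bing.com/search", proxies=proxies, timeout=20)
```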

Screenshots

Screenshot: enumeration process
Screenshot: results

Additions

Two additional scripts are included in this repo to aid in generating potential username and password files:

  • pwd_gen.py – Generates custom password lists using words and variables defined at the top of the script. It can perform number/letter substitutions, append special characters, and more. Once configured, run the script with no arguments to generate a 'passwords.txt' output file.
  • user_gen.py – Generates custom usernames using inputs from firstname.txt and lastname.txt files provided at the command line. Formats are defined similarly to crosslinked.py, and results are written to 'users.txt'.
python3 user_gen.py -first top100_firstnames.txt -last top100_lastnames.txt -f "domain\{f}{last}"
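The number/letter substitution idea behind pwd_gen.py can be sketched as follows. This is a hypothetical illustration, not the script's actual code; the substitution table and suffix list stand in for the words and variables defined at the top of the real script:

```python
# Sketch of pwd_gen.py-style mutations: leetspeak substitutions,
# capitalization, and appended special characters / numbers.
SUBS = {"a": "@", "e": "3", "i": "1", "o": "0", "s": "$"}
SUFFIXES = ["", "!", "123", "2021"]

def mutate(word):
    """Return sorted password candidates derived from a single base word."""
    variants = {word, word.capitalize()}
    # Apply character substitutions to the lowercase form.
    variants.add("".join(SUBS.get(c, c) for c in word.lower()))
    return sorted(w + s for w in variants for s in SUFFIXES)

print(mutate("winter"))  # includes 'winter!', 'Winter123', 'w1nt3r2021', ...
```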

