Scraping LinkedIn Jobs with Ease Using ScrapingDog

Whether you are a recruiter searching for clients, a sales rep, or simply someone who wants to scrape one of the biggest job boards in the world… well, you are in luck.

We’ll be going over how to scrape, enrich, and put together a prospecting list from scratch using ScrapingDog.

Before we start, make sure you have accounts for all of these: ChatGPT, ScrapingDog, Google Sheets, and LinkedIn.

Let’s dive in! 

What is ScrapingDog?

Think of having thirty virtual assistants manually scraping data from a website.

Now imagine having that same capability with no staff, and with results in milliseconds. That is Scrapingdog. It lets everyone from solopreneurs to enterprise businesses gather valuable data from websites, data that can be useful for either their product or their sales process.

Scrapingdog lets you scrape websites like Twitter, LinkedIn, LinkedIn Jobs, Zillow, and more.

For the purposes of this guide, we’ll be focusing mainly on LinkedIn Jobs.

Scraping LinkedIn Jobs with ScrapingDog

This particular use case does require a bit of coding knowledge, but I’ll walk you through the step-by-step process so you can follow along.

Step 1: Find Your LinkedIn Jobs Page

If you are looking to scrape LinkedIn’s job board, you probably have a particular niche or job title you are searching for. For example, companies looking to hire developers in the US, or companies looking to hire SEO specialists. This is where you have to have your ideal persona dialed in.

For this example, let’s assume we’re looking for companies that are hiring software developers in New York City. Pretty specific, but this is how you have to think when building your prospecting lists. The more niched and segmented you are, the better you can tailor your messaging to truly resonate with your target customer.

Let’s go ahead and put that together.

In the screenshot above, there are a few things I need to point out that are important for this exercise. We will need to take note of the URL, the company name, and the job title. They’ll be important here shortly.

Now that we have our starting point, we’re ready to use Scrapingdog.

Step 2: Scraping with Scrapingdog

Assuming you created an account on Scrapingdog earlier, we’ll need to grab the API key. As soon as you log in, you will see your API key in a few different places. The image below shows where you’d be able to find it.

Unfortunately for us, you are not able to scrape LinkedIn Jobs through their app. It has to be done through their API, so we’ll need to do a little bit of coding.

Once you have your API key, we’re ready to make an API call. To find their API documentation, you can go to this page here.

Here is their example API call.

import requests

# Example from the docs: fill in your own API key, search keywords, location geoid, and page number
payload = {'api_key': 'APIKEY', 'field': 'Python', 'geoid': '100293800', 'page': '1'}

resp = requests.get('https://api.scrapingdog.com/linkedinjobs', params=payload)

print(resp.json())

Let’s break down the key elements of this: the “API Key”, the “Field”, the “Page”, and the “Geoid”.

Remember how we set up our LinkedIn Jobs page? That’s where we will be grabbing the information we need to fill in our API call.

This is the URL of the LinkedIn Jobs page. We’re looking for companies hiring software developers in New York City.

software%20developers – this is our “Field”

The first page of our LinkedIn Jobs search – this is our “Page”

90000070 – this is our “Geoid”

12345678910 – this is our API key (this is not a real key; be sure to fill in your own API key)
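By the way, if you don’t want to eyeball these values in the URL, a few lines of Python can pull them out for you. This is just a rough sketch, and it assumes a typical LinkedIn job-search URL with keywords and geoId query parameters – double-check that your own URL actually uses those parameter names.

from urllib.parse import urlparse, parse_qs, quote

# Paste your own LinkedIn job-search URL here (this format is an assumption, not guaranteed)
url = "https://www.linkedin.com/jobs/search/?keywords=software%20developers&geoId=90000070"

params = parse_qs(urlparse(url).query)

field = quote(params["keywords"][0])   # -> 'software%20developers'
geoid = params["geoId"][0]             # -> '90000070'

print(field, geoid)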

Now that we have our data, let’s edit our API call.

import requests

# Same call, now filled in with our own values (replace the API key with yours)
payload = {'api_key': '648afc678ce197654ff118fc', 'field': 'software%20developers', 'geoid': '90000070', 'page': '1'}

resp = requests.get('https://api.scrapingdog.com/linkedinjobs', params=payload)

print(resp.json())

Now that this is adjusted with our desired data, we’re almost ready to scrape.

Step 3: Grabbing the Data

Keep in mind that the documentation provided by Scrapingdog is just that… documentation. We need to turn this into what’s called a cURL request.

To do this, we’re going to use the help of ChatGPT. Assume your computer can run cURL requests until you find out it can’t; if it can’t, Google how to set it up for your operating system. (On a Mac, cURL ships with the OS, and you can confirm it’s there by typing curl --version in Terminal.)

We’re now going to copy our new API call from above and paste it into ChatGPT, then prompt it to turn it into a cURL request.

This is what you should see.
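If you’d rather not lean on ChatGPT for this step, you can also have Python print the command for you. Here’s a minimal sketch of the same request from earlier; the API key is a placeholder, so swap in your own:

# Build the request URL by hand and print a one-line cURL command for it
api_key = "YOUR_API_KEY"   # placeholder - use your real Scrapingdog key
field = "software%20developers"
geoid = "90000070"
page = "1"

url = ("https://api.scrapingdog.com/linkedinjobs"
       f"?api_key={api_key}&field={field}&geoid={geoid}&page={page}")

print(f'curl "{url}"')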

Now that you have your cURL request, you are going to open “Terminal” on your computer. I am not sure how this is done on Windows, but you can Google it. This is specific to Mac, as that is what I’m currently using.

This is what you should see when you open the application.

Now copy the cURL request that ChatGPT produced and paste it directly into the terminal.

If you did everything correctly, you should now see a crazy blob of text like the image below, but don’t panic! It means you did everything right.

This crazy blob of text is called JSON. It contains the data we were looking for: the company name, the job title, the location, and so on. Practically everything about the job listing.

Now we need to turn this into a readable format. Go ahead and copy the entire text/result that was produced.

You are going to paste it directly into ChatGPT and prompt it – “Can you turn this into readable JSON format?”

If you did it correctly, you should now see the image below.

You are going to want to click “Continue generating” a few times until it is finished. You should have 20-25 job listings at the end of it. To double-check, you can ask ChatGPT – “How many job postings are listed above?”
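If you’d rather skip the back-and-forth with ChatGPT entirely, the same clean-up is a few lines of Python. This sketch assumes you saved the terminal output to a file called jobs.json (a name I’m picking just for this example) and that the response is a JSON array of listings:

import json

# Load the raw output you saved from the terminal (the file name is just an example)
with open("jobs.json") as f:
    jobs = json.load(f)

# Pretty-print the listings as readable, indented JSON
print(json.dumps(jobs, indent=2))

# Count how many job postings came back
print(f"Job postings: {len(jobs)}")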

Step 4: Organizing the Data

Now that we have the data in a readable and digestible format, we’re going to ask ChatGPT to pull the list of company names and job roles.

Here we have the list of companies.

Here is the list of job roles, and they correspond to the company names.

Now we can paste this data directly into a Clay table, and this is what it should look like.
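If you’d prefer to build that table programmatically instead of copy-pasting out of ChatGPT, here’s a small sketch. The field names company_name and job_position are assumptions based on typical Scrapingdog output, so check them against the keys in your own JSON before running it:

import csv
import json

# Load the saved response (same jobs.json file from the earlier example)
with open("jobs.json") as f:
    jobs = json.load(f)

# Field names below are assumptions - adjust them to match your actual JSON keys
rows = [(job.get("company_name", ""), job.get("job_position", "")) for job in jobs]

# Write a CSV you can paste or import into Clay / Google Sheets
with open("prospects.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Company", "Job Role"])
    writer.writerows(rows)

print(f"Wrote {len(rows)} rows to prospects.csv")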

And now we’re ready to enrich! Happy prospecting!
