9 support issues
Sakib Islam
Nov 30, 2022
Very Good Tools
This extension is very good. I was searching for a good extension, and I finally found a great one. Here it is.
https://chrome.google.com/webstore/detail/data-scraper-20/jnofnfpjkakadjhakglcailcjeeicfjn
Anton Kaufmann
May 31, 2022
getdata.io
Hi Teh Gary,
Thank you for getdata.io - it is a really great extension! Kudos to you for your outstanding programming skills!
Unfortunately, your documentation is not so helpful when one wants to scrape slightly more complex data.
I hope you can answer a few questions for which I could not find any answers in your docs. Thank you very much for your help!
Basics:
I am trying to collect data from the website https://bizbuysell.com
1.) I have a list of several single webpage URLs from this website.
Can I just add several URLs, in addition to the one in the following code sample?
"engine": "chrome",
"client_version": "5.0.28",
"origin_url": "https://www.bizbuysell.com/Business-Opportunity/Very-Profitable-Landscaping-Design-and-Sod-Business/1827360/",
"actions": [
Or, if not, how should I do it?
2.) I found that the webpage itself has 41 data fields that I want to get the data from.
The problem is that the data fields are dynamically populated: only the fields that have a value are shown, and empty fields are hidden. Thus each webpage shows a unique subset of these 41 data fields.
I thought that a conditional command could solve the problem, i.e. some code for the following task:
"If data field ABCD is not empty, then get the data; if it is empty, then go to the next data field."
Is that possible?
3.) Can I write the code in an external editor and copy/paste it into the Chrome app?
4.) Do I need cookies? If yes, how can I get them?
Thank you for your help! Is there more extensive documentation available somewhere?
Best Regards,
Tony
P.S. My Google email address is my regular email address
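On points 1 and 2, setting aside GetData.IO's own configuration syntax (which the developer would need to confirm), the underlying task is to loop over a list of listing URLs and keep only the fields that are actually present on each page. A rough Python sketch, with placeholder CSS selectors that would need to match bizbuysell.com's real markup:

# Generic sketch (not GetData.IO config syntax): loop over several listing URLs
# and collect only the data fields that actually appear on each page.
# The field names and CSS selectors below are placeholders, not real selectors.
import requests
from bs4 import BeautifulSoup

urls = [
    "https://www.bizbuysell.com/Business-Opportunity/Very-Profitable-Landscaping-Design-and-Sod-Business/1827360/",
    # ...add the other listing URLs here
]

FIELD_SELECTORS = {
    "asking_price": ".asking-price",    # placeholder selector
    "cash_flow": ".cash-flow",          # placeholder selector
    "gross_revenue": ".gross-revenue",  # placeholder selector
}

rows = []
for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
    row = {"origin_url": url}
    for field, selector in FIELD_SELECTORS.items():
        node = soup.select_one(selector)
        # Conditional extraction: keep the field only if it exists and is non-empty.
        if node and node.get_text(strip=True):
            row[field] = node.get_text(strip=True)
    rows.append(row)

print(rows)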
Netspeedz
Jan 25, 2022
Scrolling
Is it possible to instruct GetData.IO to scroll down a page (vs clicking on a 'Next' button) until all items are loaded (i.e., similar to what Google Images does when searching for an image)?
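Whether GetData.IO supports this natively is for the developer to confirm, but in general an infinite-scroll page is handled by scripting the browser to keep scrolling until the page height stops growing. A rough Selenium sketch in Python:

# General illustration (not a GetData.IO feature claim): keep scrolling an
# infinite-scroll page until no new content loads, then grab the full HTML.
import time
from selenium import webdriver

driver = webdriver.Chrome()
driver.get("https://images.google.com/")  # example of an infinite-scroll page

last_height = driver.execute_script("return document.body.scrollHeight")
while True:
    driver.execute_script("window.scrollTo(0, document.body.scrollHeight);")
    time.sleep(2)  # give newly loaded items time to render
    new_height = driver.execute_script("return document.body.scrollHeight")
    if new_height == last_height:
        break  # page height stopped growing, so all items are loaded
    last_height = new_height

html = driver.page_source  # now contains every loaded item
driver.quit()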
Alex L.P.
Nov 5, 2021
Scraping data to Notion
Hello, would it be possible to scrape info from a public page, connect the scraper to the Notion API, and push the info into a specific table in Notion?
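The scraping side aside, pushing a scraped record into a Notion table is usually done through Notion's public REST API. A minimal Python sketch, assuming an integration token, a target database ID, and a table whose columns include a "Name" title property and a "URL" url property (all placeholders):

# Minimal sketch of adding one scraped record as a row in a Notion database
# via Notion's public REST API. Token, database ID, and property names are
# placeholders for your own integration and table schema.
import requests

NOTION_TOKEN = "secret_xxx"          # integration token (placeholder)
DATABASE_ID = "your-database-id"     # target Notion database (placeholder)

scraped = {"title": "Example listing", "url": "https://example.com/listing"}

response = requests.post(
    "https://api.notion.com/v1/pages",
    headers={
        "Authorization": f"Bearer {NOTION_TOKEN}",
        "Notion-Version": "2022-06-28",
        "Content-Type": "application/json",
    },
    json={
        "parent": {"database_id": DATABASE_ID},
        "properties": {
            "Name": {"title": [{"text": {"content": scraped["title"]}}]},
            "URL": {"url": scraped["url"]},
        },
    },
)
response.raise_for_status()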
yunus emre
Feb 20, 2021
Program error
I cannot download my data.
mapaydinbilgi
Aug 16, 2019
Hi, I am writing from Turkey
I like your application very much; I can do a lot of work thanks to it. I made a few attempts, but there is no Turkish character support. And when I download the data, there are line shifts. Normally I would like to get Premium, but unfortunately I can't if these problems remain. Can we resolve this? I was able to write this through translation; if it is not fully understood, we can discuss by email:
bilgimapaydin@gmail.com
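Without seeing the exported file this is only a guess, but broken Turkish characters and "line shifts" are the classic symptoms of a CSV written without UTF-8 and without quoting fields that contain newlines. A small Python sketch of a re-export that avoids both:

# Sketch of re-saving scraped rows as CSV so Turkish characters survive and
# embedded newlines do not break rows: UTF-8 with BOM (so Excel detects it)
# plus full quoting so multi-line cell values stay inside one row.
import csv

rows = [
    {"başlık": "Örnek ilan", "açıklama": "Satır 1\nSatır 2"},  # sample row with Turkish text
]

with open("export.csv", "w", newline="", encoding="utf-8-sig") as f:
    writer = csv.DictWriter(f, fieldnames=rows[0].keys(), quoting=csv.QUOTE_ALL)
    writer.writeheader()
    writer.writerows(rows)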
D2E EV2Europe
Feb 11, 2019
Need your help: how to choose from the dropdown buttons automatically
I am trying to scrape the whole dealer list from a website:
http://www.chery.cn/buysupport/dealerlist/
But it has 3 dropdown buttons to choose the Model, Province, and City. How can I set it up so that it changes the dropdown selections automatically?
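GetData.IO's own handling of dropdowns is for the developer to confirm, but as a general illustration, cycling through linked Model/Province/City dropdowns is usually done by driving the browser and selecting each option in turn. A rough Selenium sketch in Python, with placeholder element IDs:

# Generic sketch of cycling through three dependent dropdowns (Model, Province,
# City); the element IDs below are placeholders that would need to match the
# actual markup on chery.cn.
import time
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import Select

driver = webdriver.Chrome()
driver.get("http://www.chery.cn/buysupport/dealerlist/")

def option_texts(element_id):
    # Read the visible text of every option in a given dropdown.
    return [o.text for o in Select(driver.find_element(By.ID, element_id)).options]

for model in option_texts("model"):            # placeholder ID
    Select(driver.find_element(By.ID, "model")).select_by_visible_text(model)
    time.sleep(1)                              # wait for the Province list to refresh
    for province in option_texts("province"):  # placeholder ID
        Select(driver.find_element(By.ID, "province")).select_by_visible_text(province)
        time.sleep(1)                          # wait for the City list to refresh
        for city in option_texts("city"):      # placeholder ID
            Select(driver.find_element(By.ID, "city")).select_by_visible_text(city)
            time.sleep(1)
            # ...scrape the dealers shown for this Model/Province/City combination

driver.quit()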
Vinayak Parasher
Aug 10, 2016
how to use
I have downloaded the extension. I was trying to scrape data from a wiki page; I activated the extension, which appeared in the right corner of my browser, but I could not get it working. Thanks for any suggestions.
Steven Green
Jul 23, 2013
Help
Good afternoon,
I have been trying, with limited success, to operate Krake as a test for the last two days.
It really is an interesting service and I would like to upgrade, but I need to know that I can get the functionality I require out of it.
Could you please contact me to discuss?
Thank you and best regards,