After working on several data-scraping projects in Go, I have gathered some experience with it. Here is what I learned:
Tooling for the crawler (Go + PuerkitoBio/goquery)
Before Go, I tried Python (Scrapy). It works, but I never felt comfortable with it. I do not know why, but after trying Go I do not want to go back to Python; maybe Go is simply enough for me now. The only third-party package I need to import is PuerkitoBio/goquery, which I use for extracting data from the response body.
Chrome and its debugger
Before settling on Chrome's built-in tools, I tried many Chrome extensions, such as Web Sniffer, to capture requests. But after using Chrome's debugger (DevTools) for a while, I realized it is all I need to inspect requests. The one setting to pay attention to is Preserve log, which keeps the captured requests from being cleared when the page navigates.
Understanding how to make a request with a body in Go
Because we did not build that website's back-end, we do not know how it works. We have to reproduce each request from the information we can see in DevTools, such as the Query String Parameters and the request body.
After doing this for a while, it sometimes drives me crazy: I cannot understand why a request does not work even though I think I did everything exactly right. But it is like a game; when you figure out what the trick is, you win, and I am really excited about that moment. Try, read the error message, try again.
There is a lot more to achieve and discover. But again, with Go I feel very comfortable with its simplicity and efficiency.