danielms.site
www.4async.com
Go 1.17 has been released, and on Go's six-month release cycle Go 1.18 is next. This post revisits how interface{} has been used in Go code through 1.17, and what to keep in mind about interface{} ahead of the changes coming in Go 1.18. ...
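As a rough illustration of the interface{}-versus-type-parameters contrast that Go 1.18 introduced (the function names below are illustrative, not taken from the post):

```go
package main

import "fmt"

// Before Go 1.18: accept anything via interface{} and recover
// the concrete type with a type assertion at runtime.
func FirstAny(xs []interface{}) interface{} {
	return xs[0]
}

// Go 1.18+: a type parameter keeps the element type static, so
// no assertion is needed and type mistakes fail at compile time.
func First[T any](xs []T) T {
	return xs[0]
}

func main() {
	n := FirstAny([]interface{}{1, 2, 3}).(int) // runtime assertion
	m := First([]int{1, 2, 3})                  // m is already an int
	fmt.Println(n, m)                           // → 1 1
}
```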
hjr265.me
Yes, I know there are paid and free tools for doing this. And yes, I know there are tools for this that I can run locally. But this exercise allowed me to try out the well-designed Go package github.com/gocolly/colly. Colly is a web scraping framework for Go. Here is how I used it to quickly scan my website (the one you are on right now) for broken links. First I defined a type for links to check and the URL of the page they appear on:
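The excerpt ends right where that type definition would begin. A minimal sketch of such a type, with hypothetical field names (the post's actual definition is not shown), plus a plain net/http check that a Colly callback could feed links into:

```go
package main

import (
	"fmt"
	"net/http"
	"net/http/httptest"
)

// Link pairs a URL to verify with the page it was found on, so a
// broken link can be reported with its location. Field names here
// are illustrative, not from the original post.
type Link struct {
	URL  string // the href to verify
	Page string // the page the link appears on
}

// IsBrokenStatus treats 4xx and 5xx responses as broken.
func IsBrokenStatus(code int) bool {
	return code >= 400
}

// Check issues a HEAD request and reports whether the link is broken.
func Check(l Link) (bool, error) {
	resp, err := http.Head(l.URL)
	if err != nil {
		return true, err
	}
	resp.Body.Close()
	return IsBrokenStatus(resp.StatusCode), nil
}

func main() {
	// A throwaway test server stands in for a real site here, so
	// the sketch runs without touching the network.
	srv := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		if r.URL.Path == "/missing" {
			w.WriteHeader(http.StatusNotFound)
		}
	}))
	defer srv.Close()

	broken, _ := Check(Link{URL: srv.URL + "/missing", Page: srv.URL})
	fmt.Println("broken:", broken) // → broken: true
}
```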
mfbmina.dev
Nowadays, a huge part of a developer's work consists of calling APIs, sometimes to integrate with a team within the company, sometimes to build an integration with a supplier. The other big part of the daily work is writing tests. Tests ensure (or should guarantee :D) that all the code we write works as expected and, therefore, that no surprises happen when the feature is running in the production environment.
karlherrick.com