Description:
Job summary: develops, maintains, and supports policies and procedures for ensuring the functionality and accuracy of the programs used to spider labor data from external sites. Relies on direction and good judgment to plan and accomplish goals. Works under general supervision.
Duties and responsibilities:
Supports and documents new development of spidering robot programs (bots) to quickly and accurately extract data from external websites
Ensures all bots meet full QA standards
Ensures assigned bots are fully functional and resolves problems when they are not
Meets expected performance metrics, with goals set on a weekly and monthly basis
As experience grows, develops an understanding of the bot library, spidering software, and database processing of extracted data
Knows which situations require the implementation and use of a proxy
Able to maintain an average of 70–175 bot repairs per week with minimal QA issues (depends on experience and the mix of large/medium/small bots)
May be required to perform manual bot QA with reasonable speed and accuracy
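To illustrate the spidering work described above, here is a minimal sketch of a "bot" that extracts structured data from a page. The sample HTML, the class name `JobTitleBot`, and the use of Python's standard-library parser are all illustrative assumptions; the actual environment uses Kapow spidering software rather than hand-written Python.

```python
from html.parser import HTMLParser

# Stand-in for a fetched external page; a real bot would download this.
SAMPLE_PAGE = """
<html><body>
  <div class="job"><h2>Data Analyst</h2></div>
  <div class="job"><h2>QA Technician</h2></div>
</body></html>
"""

class JobTitleBot(HTMLParser):
    """Toy bot that collects the text inside <h2> tags (the job titles)."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.titles = []

    def handle_starttag(self, tag, attrs):
        if tag == "h2":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "h2":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title and data.strip():
            self.titles.append(data.strip())

bot = JobTitleBot()
bot.feed(SAMPLE_PAGE)
print(bot.titles)  # the two job titles from the sample page
```

A bot "repair" in this setting typically means adjusting the tag or attribute matching logic after the target site changes its markup.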
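On the proxy point: sites that rate-limit or block repeated automated requests are the usual situation calling for one. The sketch below shows how a proxy might be configured with Python's standard library; the proxy address is a placeholder assumption, and no network request is actually made.

```python
import urllib.request

# Route traffic through a proxy when the target site blocks direct
# automated requests. The address below is a placeholder, not a real endpoint.
proxy = urllib.request.ProxyHandler({
    "http": "http://proxy.example.com:8080",
    "https": "http://proxy.example.com:8080",
})

# Build an opener that would send all requests through the proxy.
opener = urllib.request.build_opener(proxy)
print(type(opener).__name__)
```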
Requirements:
Knowledge, skills and abilities:
Understanding of computer programming methodologies
Superior analytical skills
Able to quickly learn and adapt to new software
Basic knowledge of HTML, web technologies, and programming languages
Basic SQL knowledge
Capable of writing basic SQL queries to check the results of assigned bots and the basic tables used to store job data
Familiar with Design Studio, Microsoft SQL Server, and Microsoft Team Foundation Server
Basic understanding of the database tables directly related to the Kapow software and all tables involving storage of spidered jobs
Demonstrates good comprehension of the data flow from the spidering applications through the databases to the production environment
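A query of the kind described above might look like the following sketch, which uses Python's built-in sqlite3 as a stand-in. The table and column names (`spidered_jobs`, `bot_id`, and so on) are illustrative assumptions; the actual environment stores jobs in Microsoft SQL Server.

```python
import sqlite3

# In-memory stand-in for the jobs database; the real environment uses
# Microsoft SQL Server, and the table/column names here are assumptions.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE spidered_jobs (
        bot_id     INTEGER,
        job_title  TEXT,
        scraped_at TEXT
    )
""")
conn.executemany(
    "INSERT INTO spidered_jobs VALUES (?, ?, ?)",
    [(101, "Data Analyst", "2023-01-01"),
     (101, "QA Technician", "2023-01-01"),
     (202, "Developer", "2023-01-02")],
)

# A basic QA-style check: how many jobs did each assigned bot store?
rows = conn.execute("""
    SELECT bot_id, COUNT(*) AS job_count
    FROM spidered_jobs
    GROUP BY bot_id
    ORDER BY bot_id
""").fetchall()
print(rows)  # [(101, 2), (202, 1)]
```

A bot returning an unexpectedly low count for a feed it normally populates is a common signal that it needs repair.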
Credentials and experience:
Two-year degree in computer science or equivalent work experience
Equal opportunity employer M/F/D/V