fix robots.txt proxy-down test by setting site.id (the cached robots.txt is stored by site.id, and other tests that ran earlier with no site.id were interfering); also test another kind of connection error, for whatever that's worth

Noah Levitt 2017-04-18 12:00:23 -07:00
parent dc43794363
commit ac972d399f
3 changed files with 45 additions and 42 deletions

@@ -32,7 +32,7 @@ def find_package_data(package):
 setuptools.setup(
         name='brozzler',
-        version='1.1b11.dev233',
+        version='1.1b11.dev234',
         description='Distributed web crawling with browsers',
         url='https://github.com/internetarchive/brozzler',
         author='Noah Levitt',
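
The hunk above only shows the version bump; the test change itself is not shown. Below is a minimal sketch of the kind of fix the commit message describes, assuming brozzler's public names brozzler.Site, brozzler.is_permitted_by_robots, and brozzler.ProxyError; the seed URL, the literal site.id value, and the exact test body are illustrative assumptions, not copied from this commit.

import socket

import pytest

import brozzler


def test_robots_proxy_down():
    # hypothetical seed url; the real test presumably uses a local test server
    site = brozzler.Site(None, {'seed': 'http://example.com/'})

    # give the site an id of its own so the cached robots.txt, which is
    # keyed by site.id, does not collide with cache entries left behind by
    # earlier tests that ran with site.id unset
    site.id = 'test_robots_proxy_down'

    # find a local port that nothing is listening on, then point the proxy
    # setting at it so the robots.txt fetch fails with a connection error
    sock = socket.socket()
    sock.bind(('127.0.0.1', 0))
    not_listening_proxy = '127.0.0.1:%s' % sock.getsockname()[1]
    sock.close()

    # fetching robots.txt through a dead proxy should surface as a
    # brozzler.ProxyError rather than being swallowed
    with pytest.raises(brozzler.ProxyError):
        brozzler.is_permitted_by_robots(
                site, 'http://example.com/', proxy=not_listening_proxy)

The "another kind of connection error" mentioned in the commit message would follow the same pattern: provoke a different low-level failure while fetching robots.txt and assert that it, too, is reported rather than silently treated as "allowed".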