Fix merge problem in pull request #43. Previously, there was a single for loop that iterated through all of the sites. With the addition of the parallel request functionality, there are now two for loops. The dictionary updates were not applied in the second loop, which produced bogus results.

pull/55/head
Christopher K. Hoadley 5 years ago
parent f9d59270a3
commit 8090a96c57

@@ -143,7 +143,7 @@ def sherlock(username, verbose=False, tor=False, unique_tor=False):
         results_total[social_network] = results_site
     # Core logic: If tor requests, make them here. If multi-threaded requests, wait for responses
-    for social_network in data:
+    for social_network, net_info in data.items():
         # Retrieve results again
         results_site = results_total.get(social_network)
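The hunk above is the fix itself. As a minimal sketch of the two-loop structure the commit message describes (hypothetical names and data, not the exact Sherlock code): the first loop seeds the per-site dictionary and launches the work, the second loop collects results, and iterating the second loop with .items() keeps net_info in scope so the same per-site dictionary updates can be applied there as well.

# Minimal illustrative sketch (simplified; not the real module)
data = {
    "ExampleSite": {"url": "https://example.com/{}", "errorType": "status_code"},
}
username = "some_username"
results_total = {}

# First loop: build the per-site result dictionary and (in the real code)
# start the request for each site, possibly in parallel or over Tor.
for social_network, net_info in data.items():
    results_site = {"url_user": net_info["url"].format(username)}
    results_total[social_network] = results_site

# Second loop: wait for each response and finish the per-site dictionary.
# Using data.items() here (this commit's fix) keeps net_info available, so the
# dictionary updates made in the first loop can also be made in this one.
for social_network, net_info in data.items():
    results_site = results_total.get(social_network)
    results_site["exists"] = "Claimed"  # in the real code, derived from the response
    results_total[social_network] = results_site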
