Python Forum

Full Version: Request Delay
Hi,
I'm using requests and bs4 with this website: https://www.binance.com/en/support/annou...9?navId=49. I send a GET request every 3 seconds to detect the latest news.

My code:
latest_news_url = "" while True: time.sleep(3) source = requests.get("https://www.binance.com/en/support/announcement/c-49? avId=49") soup = BeautifulSoup(source.text, 'html.parser') ... ... lastest_news_url_detected = ... if (lastest_news_url_detected == latest_news_url): print("no news detected") print(current_time) else: latest_news_url = lastest_news_url_detected print (shared time) print(current time)
My current time is right, and the website's shared time is also right, but my code works wrong: it only detects the news 7-8 minutes later. For example, the latest news was shared at 20.00; my console:
Output:
19.59.57 - no news detected
20.00.00 - no news detected
20.00.03 - no news detected
..
..
..
20.08.00 - detect! shared time = 20.00.00
I hope I was able to explain my problem. What should I do? Why is this happening? Cache? Cookies? ...?
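One way to rule out a cached copy being served by the client or an intermediate proxy/CDN is to send cache-busting headers with each request, optionally together with a throwaway query parameter. This is only a sketch of that idea; the header values and the extra "_" parameter are illustrative, not something the site requires.

import time
import requests

url = "https://www.binance.com/en/support/announcement/c-49?navId=49"

# Ask any caches on the path to revalidate instead of serving a stored response
headers = {
    "Cache-Control": "no-cache",
    "Pragma": "no-cache",
}

# The throwaway "_" parameter makes every request URL unique,
# which defeats most URL-keyed caches (purely illustrative)
response = requests.get(url, headers=headers, params={"_": int(time.time())})
print(response.status_code)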
It's better to use schedule for this than a bare while True loop with time.sleep(); schedule works fine for a task like this.

Here is an example: write the latest news link to disk once, then check against that file; if the site has been updated, run the news job again.
import schedule
import time, datetime
import requests
from bs4 import BeautifulSoup

def news(url):
    # Fetch the latest news link and write it to disk
    response = requests.get(url)
    soup = BeautifulSoup(response.content, 'lxml')
    news = soup.select_one('#link-0-0-p1').get('href')
    print('News run')
    with open('news.txt', 'w') as f:
        f.write(news)
    # Returning CancelJob makes schedule drop this job after one run
    return schedule.CancelJob

def check_update(url):
    # Fetch the latest news link and compare it with the one saved on disk
    response = requests.get(url)
    soup = BeautifulSoup(response.content, 'lxml')
    news_update = soup.select_one('#link-0-0-p1').get('href')
    with open('news.txt') as f:
        if news_update == f.read():
            print('No update on site')
            print(datetime.datetime.utcnow())
        else:
            print('Site has gotten updated')
            print(datetime.datetime.utcnow())
            # Update: start again once
            news(url)

if __name__ == '__main__':
    url = 'https://www.binance.com/en/support/announcement/c-49?avId=49'
    schedule.every(10).seconds.do(news, url=url)
    schedule.every(60).seconds.do(check_update, url=url)
    while True:
        schedule.run_pending()
        time.sleep(1)
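A note on the design above: news() returns schedule.CancelJob, so schedule drops that job after it has written news.txt once; from then on only check_update() keeps polling, every 60 seconds in this setup. schedule and the lxml parser are third-party packages, so they need to be installed first (pip install schedule lxml).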