Python Forum
Request Delay
#1
Hi,
I'm using requests and bs4 for this website: https://www.binance.com/en/support/annou...9?navId=49. I send a GET request every 3 seconds to detect the latest news.

My code:
latest_news_url = ""

while True:
    time.sleep(3)
    source = requests.get("https://www.binance.com/en/support/announcement/c-49?navId=49")
    soup = BeautifulSoup(source.text, 'html.parser')
    ...
    ...
    latest_news_url_detected = ...
    if latest_news_url_detected == latest_news_url:
        print("no news detected")
        print(current_time)
    else:
        latest_news_url = latest_news_url_detected
        print(shared_time)
        print(current_time)
My local time is correct, and the time shown on the website is also correct, but my code works wrong: it only detects the news 7-8 minutes late. For example, the latest news was shared at 20.00; my console shows:
Output:
19.59.57 - no news detected
20.00.00 - no news detected
20.00.03 - no news detected
..
..
..
20.08.00 - detect! shared time = 20.00.00
I hope I was able to explain my problem. What should I do? Why is this happening? Caching? Cookies?
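One quick way to test the caching guess (only a hypothesis here, not a confirmed cause) is to ask any intermediate cache or CDN for a fresh copy: send no-cache request headers and add a throwaway, ever-changing query parameter. A minimal sketch with requests; the _ts parameter name is made up purely for cache busting:

import time
import requests

session = requests.Session()
# Ask intermediate caches/CDNs not to serve a stored copy.
headers = {"Cache-Control": "no-cache", "Pragma": "no-cache"}

while True:
    time.sleep(3)
    # A changing query value is a common cache-busting trick; the "_ts"
    # name is arbitrary and simply ignored by the site.
    response = session.get(
        "https://www.binance.com/en/support/announcement/c-49",
        params={"navId": 49, "_ts": int(time.time())},
        headers=headers,
    )
    print(response.status_code, len(response.text))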
#2
It's better to use schedule for this; it runs in a while True loop with time.sleep() added (the run_pending() loop at the bottom of the example). schedule works fine for this.
An example: write the latest news link to disk once, then check against that file; if the site has been updated, start again.
import schedule
import time, datetime
import requests
from bs4 import BeautifulSoup

def news(url):
    # Fetch the current top news link and save it to disk once.
    response = requests.get(url)
    soup = BeautifulSoup(response.content, 'lxml')
    news = soup.select_one('#link-0-0-p1').get('href')
    print('News run')
    with open('news.txt', 'w') as f:
        f.write(news)
    # Returning CancelJob removes this job from the scheduler after one run.
    return schedule.CancelJob

def check_update(url):
    response = requests.get(url)
    soup = BeautifulSoup(response.content, 'lxml')
    news_update = soup.select_one('#link-0-0-p1').get('href')
    with open('news.txt') as f:
        if news_update == f.read():
            print('No update on site')
            print(datetime.datetime.utcnow())
        else:
            print('Site has gotten updated')
            print(datetime.datetime.utcnow())
            # Updated: record the new link, then start checking again
            news(url)

if __name__ == '__main__':
    url = 'https://www.binance.com/en/support/announcement/c-49?navId=49'
    schedule.every(10).seconds.do(news, url=url)
    schedule.every(60).seconds.do(check_update, url=url)
    while True:
        schedule.run_pending()
        time.sleep(1)
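A note on how the two jobs interact: news() returns schedule.CancelJob, so the scheduler drops it after the first run, and only check_update() keeps polling every 60 seconds; when it sees a new link it calls news(url) directly to refresh news.txt. One practical caveat (an addition, not part of the post above): select_one() returns None if #link-0-0-p1 is not found, and a failed request raises an exception, either of which stops the script. A hedged variant of check_update() that skips the round instead of crashing, assuming the same selector:

import datetime
import requests
from bs4 import BeautifulSoup

def check_update(url):
    try:
        response = requests.get(url, timeout=10)
        response.raise_for_status()
    except requests.RequestException as exc:
        # Network problem or HTTP error status: skip this round instead of crashing.
        print(f'Request failed: {exc}')
        return
    soup = BeautifulSoup(response.content, 'lxml')
    link = soup.select_one('#link-0-0-p1')
    if link is None:
        # Selector no longer matches; the page markup may have changed.
        print('Could not find the news link on the page')
        return
    news_update = link.get('href')
    with open('news.txt') as f:
        if news_update == f.read():
            print('No update on site')
        else:
            print('Site has gotten updated')
            # The original calls news(url) here to write the new link to news.txt.
    print(datetime.datetime.utcnow())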


