How do I get millisecond and microsecond-resolution timestamps in Python?
I'd also like the Arduino-like delay()
(which delays in milliseconds) and delayMicroseconds()
functions.
I read other answers before asking this question, but they rely on the time
module, which prior to Python 3.3 did NOT have any type of guaranteed resolution whatsoever. Its resolution is all over the place. The most upvoted answer here quotes a Windows resolution (using their answer) of 16 ms, which is 32000 times worse than my answer provided here (0.5 us resolution). Again, I needed 1 ms and 1 us (or similar) resolutions, not 16000 us resolution.
Related:
- my own answer on how to do the same thing (get ms and us-resolution timestamps) in C++
Here's a fully-functional module for both Linux (works with pre-Python 3.3 too) and Windows, with functions and code samples.
Functions include:
- micros()
- millis()
- delay()
- delayMicroseconds()
Python code module:
"""
GS_timing.py
-create some low-level Arduino-like millis() (milliseconds) and micros()
(microseconds) timing functions for Python
By Gabriel Staples
http://www.ElectricRCAircraftGuy.com
-click "Contact me" at the top of my website to find my email address
Started: 11 July 2016
Updated: 13 Aug 2016
History (newest on top):
20160813 - v0.2.0 created - added Linux compatibility, using ctypes, so that it's compatible with pre-Python 3.3 (for Python 3.3 or later just use the built-in time functions for Linux, shown here: https://docs.python.org/3/library/time.html)
-ex: time.clock_gettime(time.CLOCK_MONOTONIC_RAW)
20160711 - v0.1.0 created - functions work for Windows *only* (via the QPC timer)
References:
WINDOWS:
-personal (C++ code): GS_PCArduino.h
1) Acquiring high-resolution time stamps (Windows)
-https://msdn.microsoft.com/en-us/library/windows/desktop/dn553408(v=vs.85).aspx
2) QueryPerformanceCounter function (Windows)
-https://msdn.microsoft.com/en-us/library/windows/desktop/ms644904(v=vs.85).aspx
3) QueryPerformanceFrequency function (Windows)
-https://msdn.microsoft.com/en-us/library/windows/desktop/ms644905(v=vs.85).aspx
4) LARGE_INTEGER union (Windows)
-https://msdn.microsoft.com/en-us/library/windows/desktop/aa383713(v=vs.85).aspx
-*****https://stackoverflow.com/questions/4430227/python-on-win32-how-to-get-absolute-timing-cpu-cycle-count
LINUX:
-https://stackoverflow.com/questions/1205722/how-do-i-get-monotonic-time-durations-in-python
"""
import ctypes, os
#Constants:
VERSION = '0.2.0'
#-------------------------------------------------------------------
#FUNCTIONS:
#-------------------------------------------------------------------
#OS-specific low-level timing functions:
if (os.name=='nt'): #for Windows:
    def micros():
        "return a timestamp in microseconds (us)"
        tics = ctypes.c_int64()
        freq = ctypes.c_int64()
        #get ticks on the internal ~2MHz QPC clock
        ctypes.windll.Kernel32.QueryPerformanceCounter(ctypes.byref(tics))
        #get the actual freq. of the internal ~2MHz QPC clock
        ctypes.windll.Kernel32.QueryPerformanceFrequency(ctypes.byref(freq))
        t_us = tics.value*1e6/freq.value
        return t_us

    def millis():
        "return a timestamp in milliseconds (ms)"
        tics = ctypes.c_int64()
        freq = ctypes.c_int64()
        #get ticks on the internal ~2MHz QPC clock
        ctypes.windll.Kernel32.QueryPerformanceCounter(ctypes.byref(tics))
        #get the actual freq. of the internal ~2MHz QPC clock
        ctypes.windll.Kernel32.QueryPerformanceFrequency(ctypes.byref(freq))
        t_ms = tics.value*1e3/freq.value
        return t_ms

elif (os.name=='posix'): #for Linux:
    #Constants:
    CLOCK_MONOTONIC_RAW = 4 # see <linux/time.h> here: https://github.com/torvalds/linux/blob/master/include/uapi/linux/time.h

    #prepare ctype timespec structure of {long, long}
    class timespec(ctypes.Structure):
        _fields_ = [
            ('tv_sec', ctypes.c_long),
            ('tv_nsec', ctypes.c_long)
        ]

    #Configure Python access to the clock_gettime C library, via ctypes:
    #Documentation:
    #-ctypes.CDLL: https://docs.python.org/3.2/library/ctypes.html
    #-librt.so.1 with clock_gettime: https://docs.oracle.com/cd/E36784_01/html/E36873/librt-3lib.html
    #-Linux clock_gettime(): http://linux.die.net/man/3/clock_gettime
    librt = ctypes.CDLL('librt.so.1', use_errno=True)
    clock_gettime = librt.clock_gettime
    #specify input arguments and types to the C clock_gettime() function
    # (int clock_ID, timespec* t)
    clock_gettime.argtypes = [ctypes.c_int, ctypes.POINTER(timespec)]

    def monotonic_time():
        "return a timestamp in seconds (sec)"
        t = timespec()
        #(Note that clock_gettime() returns 0 for success, or -1 for failure, in
        # which case errno is set appropriately)
        #-see here: http://linux.die.net/man/3/clock_gettime
        if clock_gettime(CLOCK_MONOTONIC_RAW, ctypes.pointer(t)) != 0:
            #if clock_gettime() returns an error
            errno_ = ctypes.get_errno()
            raise OSError(errno_, os.strerror(errno_))
        return t.tv_sec + t.tv_nsec*1e-9 #sec

    def micros():
        "return a timestamp in microseconds (us)"
        return monotonic_time()*1e6 #us

    def millis():
        "return a timestamp in milliseconds (ms)"
        return monotonic_time()*1e3 #ms

#Other timing functions:
def delay(delay_ms):
    "delay for delay_ms milliseconds (ms)"
    t_start = millis()
    while (millis() - t_start < delay_ms):
        pass #do nothing
    return

def delayMicroseconds(delay_us):
    "delay for delay_us microseconds (us)"
    t_start = micros()
    while (micros() - t_start < delay_us):
        pass #do nothing
    return

#-------------------------------------------------------------------
#EXAMPLES:
#-------------------------------------------------------------------
#Only execute this block of code if running this module directly,
#*not* if importing it
#-see here: http://effbot.org/pyfaq/tutor-what-is-if-name-main-for.htm
if __name__ == "__main__": #if running this module as a stand-alone program
    #print loop execution time 100 times, using micros()
    tStart = micros() #us
    for x in range(0, 100):
        tNow = micros() #us
        dt = tNow - tStart #us; delta time
        tStart = tNow #us; update
        print("dt(us) = " + str(dt))

    #print loop execution time 100 times, using millis()
    print("\n")
    tStart = millis() #ms
    for x in range(0, 100):
        tNow = millis() #ms
        dt = tNow - tStart #ms; delta time
        tStart = tNow #ms; update
        print("dt(ms) = " + str(dt))

    #print a counter once per second, for 5 seconds, using delay
    print("\nstart")
    for i in range(1,6):
        delay(1000)
        print(i)

    #print a counter once per second, for 5 seconds, using delayMicroseconds
    print("\nstart")
    for i in range(1,6):
        delayMicroseconds(1000000)
        print(i)
If you know how to get the above millisecond and microsecond-resolution timestamps in Linux, please post, as that would be very helpful too.
This works for Linux too, including in pre-Python 3.3, since I'm using C functions via the ctypes module in order to read the time stamps.
(Note: code above originally posted here: http://www.electricrcaircraftguy.com/2016/07/arduino-like-millisecond-and-microsecond-timestamps-in-python.html)
Special thanks to @ArminRonacher for his brilliant pre-Python 3.3 Linux answer here: https://stackoverflow.com/a/1205762/4561887
Update: prior to Python 3.3, the built-in Python time library (https://docs.python.org/3.5/library/time.html) didn't have any explicitly high-resolution functions. Now, however, it does provide other options, including some high-resolution functions.
My module above, however, provides high-resolution timestamps for Python code before Python 3.3, as well as after, and it does so on both Linux and Windows.
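As a sketch of those newer built-in options (the `*_ns` integer variants require Python 3.7+, and `time.clock_gettime()` is only exposed on Unix):

```python
import time

# Python 3.3+: monotonic, high-resolution clocks in the standard library
t_sec = time.monotonic()      # monotonic clock; never goes backwards
t_sec2 = time.perf_counter()  # highest-resolution clock available

# Python 3.7+: integer-nanosecond variants avoid float rounding error
t_ns = time.monotonic_ns()
millis = t_ns // 1_000_000    # ms timestamp
micros = t_ns // 1_000        # us timestamp

# On Linux, Python 3.3+ also exposes clock_gettime() directly,
# so you can pick CLOCK_MONOTONIC_RAW without any ctypes tricks:
if hasattr(time, 'clock_gettime'):
    t_raw = time.clock_gettime(time.CLOCK_MONOTONIC_RAW)  # seconds, float
```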
Here's an example of what I mean, showing that the time.sleep()
function is NOT necessarily a high-resolution function. On my Windows machine, its resolution is perhaps 8 ms at best, whereas my module above has 0.5 us resolution (16000 times better!) on the same machine.
Code demonstration:
import time
import GS_timing as timing

def delayMicroseconds(n):
    time.sleep(n / 1000000.)

def delayMillisecond(n):
    time.sleep(n / 1000.)

t_start = 0
t_end = 0

#using time.sleep
print('using time.sleep')
print('delayMicroseconds(1)')
for x in range(10):
    t_start = timing.micros() #us
    delayMicroseconds(1)
    t_end = timing.micros() #us
    print('dt (us) = ' + str(t_end - t_start))
print('delayMicroseconds(2000)')
for x in range(10):
    t_start = timing.micros() #us
    delayMicroseconds(2000)
    t_end = timing.micros() #us
    print('dt (us) = ' + str(t_end - t_start))

#using GS_timing
print('\nusing GS_timing')
print('timing.delayMicroseconds(1)')
for x in range(10):
    t_start = timing.micros() #us
    timing.delayMicroseconds(1)
    t_end = timing.micros() #us
    print('dt (us) = ' + str(t_end - t_start))
print('timing.delayMicroseconds(2000)')
for x in range(10):
    t_start = timing.micros() #us
    timing.delayMicroseconds(2000)
    t_end = timing.micros() #us
    print('dt (us) = ' + str(t_end - t_start))
SAMPLE RESULTS ON MY WINDOWS 8.1 MACHINE (notice how much worse time.sleep does):
using time.sleep
delayMicroseconds(1)
dt (us) = 2872.059814453125
dt (us) = 886.3939208984375
dt (us) = 770.4649658203125
dt (us) = 1138.7698974609375
dt (us) = 1426.027099609375
dt (us) = 734.557861328125
dt (us) = 10617.233642578125
dt (us) = 9594.90576171875
dt (us) = 9155.299560546875
dt (us) = 9520.526611328125
delayMicroseconds(2000)
dt (us) = 8799.3056640625
dt (us) = 9609.2685546875
dt (us) = 9679.5439453125
dt (us) = 9248.145263671875
dt (us) = 9389.721923828125
dt (us) = 9637.994262695312
dt (us) = 9616.450073242188
dt (us) = 9592.853881835938
dt (us) = 9465.639892578125
dt (us) = 7650.276611328125
using GS_timing
timing.delayMicroseconds(1)
dt (us) = 53.3477783203125
dt (us) = 36.93310546875
dt (us) = 36.9329833984375
dt (us) = 34.8812255859375
dt (us) = 35.3941650390625
dt (us) = 40.010986328125
dt (us) = 38.4720458984375
dt (us) = 56.425537109375
dt (us) = 35.9072265625
dt (us) = 36.420166015625
timing.delayMicroseconds(2000)
dt (us) = 2039.526611328125
dt (us) = 2046.195068359375
dt (us) = 2033.8841552734375
dt (us) = 2037.4747314453125
dt (us) = 2032.34521484375
dt (us) = 2086.2059326171875
dt (us) = 2035.4229736328125
dt (us) = 2051.32470703125
dt (us) = 2040.03955078125
dt (us) = 2027.215576171875
SAMPLE RESULTS ON MY RASPBERRY PI VERSION 1 B+ (notice that the results between using time.sleep and my module are basically identical...apparently the low-level functions in time
are already accessing better-resolution timers here, since it's a Linux machine (running Raspbian)...BUT in my GS_timing
module I am explicitly calling the CLOCK_MONOTONIC_RAW timer. Who knows what's being used otherwise):
using time.sleep
delayMicroseconds(1)
dt (us) = 1022.0
dt (us) = 417.0
dt (us) = 407.0
dt (us) = 450.0
dt (us) = 2078.0
dt (us) = 393.0
dt (us) = 1297.0
dt (us) = 878.0
dt (us) = 1135.0
dt (us) = 2896.0
delayMicroseconds(2000)
dt (us) = 2746.0
dt (us) = 2568.0
dt (us) = 2512.0
dt (us) = 2423.0
dt (us) = 2454.0
dt (us) = 2608.0
dt (us) = 2518.0
dt (us) = 2569.0
dt (us) = 2548.0
dt (us) = 2496.0
using GS_timing
timing.delayMicroseconds(1)
dt (us) = 572.0
dt (us) = 673.0
dt (us) = 1084.0
dt (us) = 561.0
dt (us) = 728.0
dt (us) = 576.0
dt (us) = 556.0
dt (us) = 584.0
dt (us) = 576.0
dt (us) = 578.0
timing.delayMicroseconds(2000)
dt (us) = 2741.0
dt (us) = 2466.0
dt (us) = 2522.0
dt (us) = 2810.0
dt (us) = 2589.0
dt (us) = 2681.0
dt (us) = 2546.0
dt (us) = 3090.0
dt (us) = 2600.0
dt (us) = 2400.0
Related:
- My 3 sets of timestamp functions (cross-linked to each other):
- For C timestamps, see my answer here: Get a timestamp in C in microseconds?
- For C++ high-resolution timestamps, see my answer here: Getting an accurate execution time in C++ (micro seconds)
- For Python high-resolution timestamps, see my answer here: How can I get millisecond and microsecond-resolution timestamps in Python?
這篇關(guān)于如何在 Python 中獲得毫秒和微秒分辨率的時間戳?的文章就介紹到這了,希望我們推薦的答案對大家有所幫助,也希望大家多多支持html5模板網(wǎng)!