
redis - Using Hashes

Question

I'm implementing a social stream and a notification system for my web application using Redis. I'm new to Redis and I have some doubts about hashes and their efficiency.

I've read this excellent Instagram engineering post and planned to implement a similar solution to keep storage minimal.

As mentioned in their blog post, this is what they did:

To take advantage of the hash type, we bucket all our Media IDs into buckets of 1000 (we just take the ID, divide by 1000 and discard the remainder). That determines which key we fall into; next, within the hash that lives at that key, the Media ID is the lookup key within the hash, and the user ID is the value. An example, given a Media ID of 1155315, which means it falls into bucket 1155 (1155315 / 1000 = 1155):

    HSET "mediabucket:1155" "1155315" "939"
    HGET "mediabucket:1155" "1155315"
    > "939"
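
To make the scheme concrete in application code, here is a minimal sketch using the redis-py client; the connection settings and the helper names (media_bucket_key, set_media_owner, get_media_owner) are illustrative assumptions of mine, not something from the Instagram post.

    # Minimal sketch of the bucketing scheme above, using redis-py.
    # Connection settings and helper names are illustrative assumptions.
    import redis

    r = redis.Redis(host="localhost", port=6379, decode_responses=True)

    BUCKET_SIZE = 1000  # the post's choice: ~1000 media IDs per hash

    def media_bucket_key(media_id: int) -> str:
        # Integer division: media ID 1155315 -> bucket 1155.
        return f"mediabucket:{media_id // BUCKET_SIZE}"

    def set_media_owner(media_id: int, user_id: int) -> None:
        # Equivalent to: HSET mediabucket:1155 1155315 939
        r.hset(media_bucket_key(media_id), str(media_id), str(user_id))

    def get_media_owner(media_id: int):
        # Equivalent to: HGET mediabucket:1155 1155315
        return r.hget(media_bucket_key(media_id), str(media_id))

    set_media_owner(1155315, 939)
    print(get_media_owner(1155315))  # -> "939"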
                  

So instead of having 1000 separate keys, they store everything in one hash with a thousand lookup keys. My doubt is: why can't we increase the number of lookup keys even further?

For example: dividing Media ID 1155315 by 10000 would make it fall into mediabucket:115, and we could go even larger.

Why are they settling on one hash bucket with 1000 lookup keys? Why can't they have one hash bucket with 100000 lookup keys? Is that related to efficiency?

I need your suggestions for implementing an efficient approach in my web application.

P.S. Please don't say that Stack Overflow is not for asking for suggestions; I don't know where else to find help.

Thanks!

Answer

Yes, it's related to efficiency.

We asked the always-helpful Pieter Noordhuis, one of Redis' core developers, for input, and he suggested we use Redis hashes. Hashes in Redis are dictionaries that can be encoded in memory very efficiently; the Redis setting 'hash-zipmap-max-entries' configures the maximum number of entries a hash can have while still being encoded efficiently. We found this setting was best around 1000; any higher and the HSET commands would cause noticeable CPU activity. For more details, you can check out the zipmap source file.

Small hashes are encoded in a special way (zipmaps) that is memory-efficient but makes operations O(N) instead of O(1). So with one zipmap of 100k fields instead of 100 zipmaps of 1k fields each, you gain no memory benefit, but all your operations become 100 times slower.
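
If you want to check this behavior on your own instance, the sketch below reuses the redis-py connection r from the earlier sketch; note that 'hash-zipmap-max-entries' is the historical name of the setting, which later Redis versions call 'hash-max-ziplist-entries' (and Redis 7+, 'hash-max-listpack-entries').

    # Inspect the small-hash thresholds and a bucket's actual encoding.
    # Reuses the redis-py connection `r` from the earlier sketch.
    print(r.config_get("hash-max-*"))
    # e.g. {'hash-max-listpack-entries': '128', 'hash-max-listpack-value': '64'}

    print(r.object("encoding", "mediabucket:1155"))
    # 'ziplist'/'listpack' while the hash is small; 'hashtable' once a
    # threshold is exceeded, at which point the memory savings are gone

Once a hash crosses the configured entry-count or value-size limit, Redis silently converts it to the regular hashtable encoding and does not convert it back, which is why the bucket size has to stay below the threshold for the savings to hold.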



