Solr Filter Cache (FastLRUCache) takes too much memory and results in out of memory?


Problem description

I have a Solr setup: one master and two slaves for replication. We have about 70 million documents in the index. The slaves have 16 GB of RAM: 10 GB for the OS and HD, 6 GB for Solr.

But from time to time, the slaves run out of memory. When we downloaded the dump file just before one of them ran out of memory, we could see that this class:

                  org.apache.solr.util.ConcurrentLRUCache$Stats @ 0x6eac8fb88
                  

is using up to 5 GB of memory. We use filter caches extensively, with a 93% hit ratio. Here is the XML for the filter cache in solrconfig.xml:

                  <property name="filterCache.size" value="2000" />
                  <property name="filterCache.initialSize" value="1000" />
                  <property name="filterCache.autowarmCount" value="20" />
                  
                  <filterCache class="solr.FastLRUCache"
                               size="${filterCache.size}"
                               initialSize="${filterCache.initialSize}"
                               autowarmCount="${filterCache.autowarmCount}"/>
                  

The query result cache has the same settings, but it uses LRUCache and only takes about 35 MB of memory. Is there something wrong with the configuration that needs to be fixed, or do I just need more memory for the filter cache?
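Every distinct filter query (each fq parameter value) ends up as one filterCache entry; below is a minimal SolrJ sketch of the kind of request that populates it. The URL, core name, and field names are made up for illustration, and a SolrJ 7/8-style HttpSolrClient is assumed.

    import org.apache.solr.client.solrj.SolrQuery;
    import org.apache.solr.client.solrj.impl.HttpSolrClient;
    import org.apache.solr.client.solrj.response.QueryResponse;

    public class FilterQueryExample {
        public static void main(String[] args) throws Exception {
            // hypothetical core URL; adjust to your own setup
            try (HttpSolrClient solr = new HttpSolrClient.Builder(
                    "http://localhost:8983/solr/mycore").build()) {

                SolrQuery query = new SolrQuery("some search terms");
                // each distinct fq string is cached as one filterCache entry
                // (a doc-matching bit set over the whole index)
                query.addFilterQuery("category:books");
                query.addFilterQuery("in_stock:true");

                QueryResponse response = solr.query(query);
                System.out.println("hits: " + response.getResults().getNumFound());
            }
        }
    }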

Recommended answer

After a friend explained to me roughly how the filter cache works, it became clear why we get out-of-memory errors from time to time.

So what does the filter cache do? Basically, it creates something like a bit array that tells which documents matched the filter. Something like:

                  cache = [1, 0, 0, 1, .. 0]
                  

1 means a hit and 0 means no hit, so in this example the cached filter matches the 0th and 3rd documents. A cache entry is essentially a bit array whose length is the total number of documents. So if I have 50 million docs, the array length will be 50 million, which means one filter cache entry takes up 50,000,000 bits of RAM.
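As an illustration only (Solr's real implementation uses its own bit-set classes; this tiny example is not taken from it), the idea of one cache entry can be sketched with java.util.BitSet:

    import java.util.BitSet;

    public class FilterCacheEntrySketch {
        public static void main(String[] args) {
            int maxDoc = 5;                  // tiny index: documents 0..4
            BitSet matches = new BitSet(maxDoc);

            // the cached filter [1, 0, 0, 1, 0] from the example above:
            matches.set(0);                  // document 0 matches the filter
            matches.set(3);                  // document 3 matches the filter

            // a real entry needs one bit per document in the index,
            // so roughly maxDoc / 8 bytes
            long approxBytes = (long) Math.ceil(maxDoc / 8.0);
            System.out.println(matches + "  ~" + approxBytes + " byte(s)");
        }
    }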

And since we specified that we want 2000 filter cache entries, the RAM it can take is roughly:

    50,000,000 * 2000 = 100,000,000,000 bits
                  

Converted to GB, that is:

    100,000,000,000 bits / 8 (to bytes) / 1000 (to KB) / 1000 (to MB) / 1000 (to GB) = 12.5 GB
                  

So the total RAM needed just for the filter cache is roughly 12.5 GB, which means that if Solr only has 6 GB of heap space, it will not be able to hold 2000 filter cache entries.
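To make the arithmetic concrete, here is a minimal Java sketch of the same back-of-the-envelope estimate; the numbers are the ones used above, and the helper name is mine, not anything from Solr:

    public class FilterCacheEstimate {

        // rough upper bound: one bit per document per cached entry,
        // ignoring per-entry object overhead (decimal units, as above)
        static double upperBoundGb(long numDocs, long maxCacheEntries) {
            double bits = (double) numDocs * maxCacheEntries;
            return bits / 8 / 1000 / 1000 / 1000;
        }

        public static void main(String[] args) {
            // 50 million documents, filterCache size = 2000
            System.out.println(upperBoundGb(50_000_000L, 2000) + " GB");  // ~12.5 GB
        }
    }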

Yes, I know Solr doesn't always create this array; if the result of a filter query is small, it can create something else that takes up less memory. This calculation just gives a rough upper limit for the filter cache when there are 2000 entries in RAM; in better cases it can be lower.

So one solution is to lower the maximum number of filter cache entries in the Solr config. We checked the Solr stats, and most of the time we only have about 600 filter cache entries, so we can reduce the filter cache size to roughly that as the maximum.
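As a rough sanity check with the same back-of-the-envelope model (these are the example figures from above, not measurements), 600 entries over 50 million documents is about 50,000,000 * 600 / 8 bytes ≈ 3.75 GB, which fits a 6 GB heap far more comfortably than the 12.5 GB worst case for 2000 entries; upperBoundGb(50_000_000L, 600) in the sketch above returns 3.75.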

The other option is, of course, to add more RAM.


