

The Dangers of the Large Object Heap (reposted; reproducing LOH memory fragmentation)

2019-11-14 16:22:20
Source: reposted (contributed by a reader)

原文地址:https://www.simple-talk.com/dotnet/.net-framework/the-dangers-of-the-large-object-heap/

 

You'd have thought that memory leaks were a thing of the past now that we use .NET. True, but we can still hit problems. We can, for example, prevent memory from being recycled if we inadvertently hold references to objects that we are no longer using. However, there is another serious memory problem in .NET that can happen out of the blue, especially if you are using large object arrays. Andrew Hunter explains...

Usually, .NET developers don’t need to think too much about how their objects are being laid out in physical memory: after all, .NET has a garbage collector and so is capable of automatically removing ‘dead’ objects and rearranging memory so that the survivors are packed in the most efficient manner possible. The garbage collector has limits to what it can do, however; and when these limits are reached, then the runtime can exhaust memory in a way that is surprising and confusing to any developer who is not aware of how .NET chooses to lay out objects in memory.

.NET manages memory in four different regions, known as heaps. You can think of each of these as a contiguous region of memory, though in practice .NET can create several fragmented regions for each heap. Three of the heaps, called the generation 0, 1 and 2 heaps, are reserved for small objects: in current versions of .NET, 'small' means objects smaller than 85,000 bytes. Every object is assigned to one of these generations according to the number of garbage collections it has survived, with the veterans ending up in generation 2. .NET knows that younger objects are more likely to be short-lived, and can reduce the performance cost of garbage collections by initially looking only at the recently allocated objects in generation 0. Perhaps more importantly, it can also move the survivors of a collection around so that there are no gaps, ensuring that the free space available for new objects is always together in one large lump. This helps with performance: unlike unmanaged allocators, .NET never needs to search for a hole big enough for a new object - if there's enough memory available, it's always in the same place. When it needs to compact a series of objects, .NET actually copies all of them to a new block of memory rather than moving them in place; this simplifies allocation, because in these small heaps the free space is always at the end, so there is never any need to scan elsewhere for a 'hole' big enough to store a new object.
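The promotion rule described above can be observed directly with GC.GetGeneration. This is a minimal illustrative sketch, not part of the original article; the exact generation numbers printed depend on when the runtime chooses to collect, though forced full collections make them predictable in practice:

```csharp
using System;

class GenerationDemo
{
    static void Main()
    {
        object survivor = new object();
        Console.WriteLine(GC.GetGeneration(survivor)); // 0: freshly allocated

        GC.Collect(); // the object lives through this collection...
        Console.WriteLine(GC.GetGeneration(survivor)); // 1: promoted

        GC.Collect(); // ...and this one
        Console.WriteLine(GC.GetGeneration(survivor)); // 2: a 'veteran' now

        GC.KeepAlive(survivor); // keep the reference live to the end
    }
}
```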

The fourth heap is known as the Large Object Heap, or LOH. 'Big' objects go here - since the threshold at which an object ends up on this heap is 85,000 bytes, this usually means arrays with more than about 20,000 elements. The LOH is treated separately by the garbage collector, which will generally try to reclaim space from the other heaps before tackling the giant objects that lurk here. Large objects pose a special problem for the runtime: they can't be reliably moved by copying, as that would require twice as much memory during garbage collection. Additionally, moving multi-megabyte objects around would cause the garbage collector to take an unreasonably long time to complete.
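The 85,000-byte threshold is easy to demonstrate, because objects on the LOH report generation 2 from the moment they are allocated. This is a small sketch of my own, not code from the article:

```csharp
using System;

class LohThresholdDemo
{
    static void Main()
    {
        // An array below the 85,000-byte threshold starts life in generation 0...
        byte[] small = new byte[84000];

        // ...while an array at or above it goes straight onto the Large Object
        // Heap, which GC.GetGeneration reports as generation 2.
        byte[] large = new byte[85000];

        Console.WriteLine(GC.GetGeneration(small)); // typically 0
        Console.WriteLine(GC.GetGeneration(large)); // 2
    }
}
```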

.NET solves these problems by simply never moving large objects around. After large objects are removed by the garbage collector, they leave behind holes in the large object heap, thereby causing the free space to become fragmented. When there's no space at the end of the large object heap for an allocation, .NET searches these holes for a space, and expands the heap if none of the holes are large enough. This can become a problem. As a program allocates and releases blocks from the large object heap, the free blocks between the longer-lived blocks can become smaller. Over time, even a program that does not leak memory, and which never requires more than a fixed amount of memory to perform an operation, can fail with an OutOfMemoryException at the point where the largest free block shrinks to a point where it is too small for the program to use. In the worst cases, the amount of usable memory can shrink at a rate that seems unbelievable.

The worst case scenario

The design of the Large Object Heap means that its worst case occurs when the short-lived arrays are large and vary somewhat in size, while the longer-lived arrays are small. The variation in size makes it likely that, when a new short-lived array is allocated, it won't fit into the space left by one of its predecessors; .NET will then have to extend the LOH to ever-greater extents. This might not seem so bad: surely the smaller blocks can be put in the large gaps that are left behind?

Well, they can, but there’s a catch. As mentioned previously, .NET prefers to put new memory blocks at the end of the heap, as it has to do less work to find a space where a block will fit: This means that when there’s a new large block on the heap, the next smaller block will be allocated immediately after it, with the consequence that it will have to further extend the heap if the program ever wants to allocate a larger block.

What does this look like in a practical program? It seems fairly esoteric: allocate something big, then something small, keep the small thing and throw away the big thing. There are a few scenarios where this pattern can appear, and it can drastically reduce the amount of memory available to a program. The most trivial scenario could just be this:

List<int> list;

...

list.Add(x);

A List object’s underlying implementation is an array and when it needs to grow, .NET allocates a new, larger array and throws the old one away. All that’s needed to start to cause problems with fragmentation is to put some other object on the LOH before the array needs to grow. The main effect here is that you might be surprised how small the size is that your ‘large’ list can grow to before .NET tells you that it has run out of memory - long before the program is using up its 2Gb theoretical maximum.


Perhaps a more interesting case is this: a program is designed to read in a large data file, one that varies in size but is usually around 16Mb or so. It reads this into memory, processes it, and then produces a 90k summary. It is later adapted to run as a batch job: processing many of these files and storing the results in memory. Quite a lot of data processing applications can look a bit like this. You might expect that it could process tens of thousands of files before memory becomes a problem: after all only the 90k summary is kept between files, but testing surprises us with failure after only a few hundred files, by using up much more memory than expected.

The following program demonstrates this case. It fills up a list with 90,000 byte arrays and prints out how many bytes were allocated before .NET throws an OutOfMemoryException. It can also, optionally, allocate larger arrays between these allocations. These are short-lived, so they don’t look like they should significantly affect the number of smaller arrays that can be allocated. Unfortunately, each time the program allocates a larger array it allocates one more byte than last time.

using System;
using System.Collections.Generic;

namespace LOH_test
{
    class Program
    {
        /// <summary>
        /// Static variable used to store our 'big' block. Only the most recent
        /// block is referenced, so each previous block becomes eligible for
        /// garbage collection as soon as it is replaced.
        /// </summary>
        static byte[] bigBlock;

        /// <summary>
        /// Allocates 90,000-byte blocks, optionally interspersed with larger blocks
        /// </summary>
        static void Fill(bool allocateBigBlocks, bool grow, bool alwaysGC)
        {
            // Number of bytes in a small block:
            // 90,000 bytes, just above the threshold, so these land on the LOH
            const int blockSize = 90000;

            // Number of bytes in a larger block: 16MB initially
            int largeBlockSize = 1 << 24;

            // Number of small blocks allocated
            int count = 0;

            try
            {
                // We keep the 'small' blocks around
                // (imagine an algorithm that allocates memory in chunks)
                List<byte[]> smallBlocks = new List<byte[]>();

                for (;;)
                {
                    // Write out some status information
                    if ((count % 1000) == 0)
                    {
                        Console.CursorLeft = 0;
                        Console.Write(new string(' ', 20));
                        Console.CursorLeft = 0;
                        Console.Write("{0}", count);
                        Console.CursorLeft = 0;
                    }

                    // Force a GC if necessary
                    if (alwaysGC) GC.Collect();

                    // Allocate a larger block if we're set up to do so
                    if (allocateBigBlocks)
                    {
                        bigBlock = new byte[largeBlockSize];
                    }

                    // The next 'large' block will be just slightly larger
                    if (grow) largeBlockSize++;

                    // Allocate (and keep) a new small block
                    smallBlocks.Add(new byte[blockSize]);
                    count++;
                }
            }
            catch (OutOfMemoryException)
            {
                // Force a GC, which should empty the LOH again
                bigBlock = null;
                GC.Collect();

                // Display the results for the amount of memory we managed to
                // allocate (long arithmetic avoids int overflow near 2GB)
                Console.WriteLine("{0}: {1}Mb allocated",
                                  (allocateBigBlocks ? "With large blocks" : "Only small blocks")
                                  + (alwaysGC ? ", frequent garbage collections" : "")
                                  + (grow ? "" : ", large blocks not growing"),
                                  ((long)count * blockSize) / (1024 * 1024));
            }
        }

        static void Main(string[] args)
        {
            // Display results for cases both with and without the larger blocks
            Fill(true, true, false);
            Fill(true, true, true);
            Fill(false, true, false);
            Fill(true, false, false);

            Console.ReadLine();
        }
    }
}

When run on a 32-bit Vista system, the program gives the following result:

Oh dear. When the program tries to allocate the larger blocks, it's a disaster - it only manages to end up using around 20Mb of long-term storage, out of a theoretical maximum of 2GB! Forcing garbage collections barely helps at all. Even in the case where the larger blocks don't grow, the amount of memory the program can use is reduced by over half. If the large blocks are taken away, the program easily creates 1.6GB of data.

The last case in which the blocks don’t grow, but the amount of available memory is still reduced, is interesting. When the LOH fills up and the program wants to allocate a new small block, .NET tries to re-use free space before it grows the heap: this means that the space where the larger block was previously being put becomes too small, and the LOH has to grow in a similar way to before. It’s not as critical, but it does still result in the amount of available memory slowly going down.

Switching to a 64-bit system will improve matters: the program will be able to use up a lot more memory before it fails, but perhaps the situation is worse - because that memory is assigned to the program, the operating system will be paging it all out to disk resulting in considerably reduced performance for every program on the system and perhaps eventually running out of disk space.

It is difficult to diagnose this problem with the tools that are currently available. The best approach available at the moment is to combine a memory profiler with the performance counters. You can look at the ‘Large Object Heap Size’ performance counter to discover how many bytes are allocated to the large object heap. This counter includes free space, so to discover whether or not the problem is that memory has run out or because the memory that is available has become too fragmented to be used you will need to use a memory profiler. A good profiler will be able to tell you the total size of all of the live objects on the large object heap. If there is a big discrepancy between the two numbers, then it is possible that your program is suffering from heap fragmentation. Unfortunately, this isn’t guaranteed: there’s no problem if there’s a large amount of free space at the end of the heap - .NET just won’t clean it up until it has to for performance reasons.
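The article predates it, but modern .NET (5 and later) exposes exactly this information programmatically through GC.GetGCMemoryInfo, whose per-generation data includes the LOH (index 3). A sketch, assuming a .NET 5+ runtime:

```csharp
using System;

class LohStats
{
    static void Main()
    {
        // Put something on the LOH, then collect so the statistics are fresh.
        byte[] big = new byte[1 << 20];
        GC.Collect();

        // GenerationInfo index 3 is the Large Object Heap on .NET 5+.
        GCMemoryInfo info = GC.GetGCMemoryInfo();
        GCGenerationInfo loh = info.GenerationInfo[3];

        // A large gap between size and live data is the fragmentation
        // (or uncollected free space) discussed above.
        Console.WriteLine("LOH size after GC:  {0} bytes", loh.SizeAfterBytes);
        Console.WriteLine("LOH fragmentation:  {0} bytes", loh.FragmentationAfterBytes);

        GC.KeepAlive(big);
    }
}
```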

Working around the problem

It might seem that nothing can be done to solve this problem apart from periodically stopping and restarting the afflicted program. This is probably the case if there is really no choice other than to deal with very large arrays of objects (or very large strings), but often it is possible to reduce or eliminate the need to use the large object heap, or to use it in such a way as to prevent fragmentation from occurring.

The best way to prevent fragmentation from occurring is to ensure that no objects above 85k in size are used for permanent storage. This ensures that the free space in the large object heap will always end up in one big chunk as short-lived objects are removed from play.

If large data structures need to live for a long time, and especially if they need to grow in size over time, the best approach is simply to consider using or writing a different data structure to store them. Arrays can contain up to around 10,000 elements before they are put on the large object heap and can cause problems, so a very effective way to store 100,000 entries might be to store 10 arrays each containing 10,000 elements: none will end up on the large object heap, so no fragmentation will occur. This could be written as an IList implementation, which would make it easy to drop in transparently to replace existing code.
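The chunked-storage idea can be sketched in a few lines. This is a minimal illustration of my own, not the article's code; a production version would implement IList&lt;T&gt; in full:

```csharp
using System;
using System.Collections.Generic;

// Stores elements in fixed-size sub-arrays small enough to stay off the LOH.
class ChunkedList<T>
{
    // 8,192 elements per chunk keeps each sub-array well under 85,000 bytes
    // for element types up to 8 bytes wide.
    private const int ChunkSize = 8192;
    private readonly List<T[]> chunks = new List<T[]>();
    private int count;

    public int Count { get { return count; } }

    public void Add(T item)
    {
        if (count % ChunkSize == 0)
            chunks.Add(new T[ChunkSize]); // a new small chunk, never on the LOH
        chunks[count / ChunkSize][count % ChunkSize] = item;
        count++;
    }

    public T this[int index]
    {
        get { return chunks[index / ChunkSize][index % ChunkSize]; }
        set { chunks[index / ChunkSize][index % ChunkSize] = value; }
    }
}
```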

Depending on the design of the code, a simpler approach might simply be to make the ‘temporary’ but large structures more permanent. If a program needs to deal with a set of large files, it will make more efficient use of memory if it allocates enough space to deal with the largest one first, and then re-uses that for every file instead of allocating just enough for each and then throwing the memory away later. Many of the classes in System.Collections have a ‘Capacity’ property to facilitate this design pattern, and thread local static variables can be used (with care) to help share a large data structure in multithreaded applications.
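The buffer-reuse pattern can be sketched as follows (my own illustration, with hypothetical file sizes): size the buffer once for the largest input, and later, smaller inputs reuse the same backing array instead of churning the LOH.

```csharp
using System;
using System.Collections.Generic;

class BufferReuseDemo
{
    // One long-lived buffer, sized once for the largest expected file,
    // instead of a fresh large allocation per file.
    static List<byte> buffer = new List<byte>();

    static void ProcessFile(int fileSize)
    {
        // Reserve enough space up front; after the largest file has been seen,
        // this never reallocates, so the LOH holds one stable block.
        if (buffer.Capacity < fileSize)
            buffer.Capacity = fileSize;

        buffer.Clear(); // Clear keeps Capacity: the backing array is reused
        for (int i = 0; i < fileSize; i++)
            buffer.Add((byte)i);
    }

    static void Main()
    {
        ProcessFile(16 * 1024 * 1024); // largest file first sizes the buffer
        int capacityAfterFirst = buffer.Capacity;

        ProcessFile(4 * 1024 * 1024);  // smaller file: no new large allocation
        Console.WriteLine(buffer.Capacity == capacityAfterFirst); // True
    }
}
```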

It is unfortunate that the CLR garbage collector deals with large objects in this manner. The effect is similar to a memory leak, but one where we get no clue as to the cause by measuring the size of the objects in memory: a memory profiler will tell you that there’s no problem, while task manager tells you that the program’s memory usage is growing ever larger – which could be due to this issue or which could be due to the behavior of unmanaged code. What is needed instead is a way of finding out where objects are in memory and how they are affecting the free memory available to .NET. This can be done with some detective work: code can be inspected for areas where large amounts of memory are likely to be allocated, and a memory profiler can be used to help discover these areas. Once these are known, a debugger can be used to step through the code and some pen and paper can give an idea of what the large object heap looks like, which can be used to decide on the best way forward.

There is a feature on the drawing board for ANTS Memory Profiler 5 that will help diagnose this issue and distinguish it from an unmanaged memory leak, and there are plans for features in future versions that will help isolate the specific cause once the issue is diagnosed.

