Performance considerations of a large hard-coded array in the .cs file

Posted by terence on Stack Overflow. Published on 2012-12-07.

I'm writing some code where performance is important. In one part of it, I have to compare a large set of pre-computed data against dynamic values. Currently, I'm storing that pre-computed data in a giant array in the .cs file:

Data[] data = { /* my data set */ };

The data set is about 90 KB, or roughly 13,000 elements. Is there any downside to doing this, as opposed to loading it from an external file? I'm not entirely sure how C# handles this internally, so I just want to be aware of any performance issues I might run into with this approach.
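For reference, the external-file alternative mentioned above could look roughly like the sketch below. The Data fields, the "data.bin" path, and the file layout (a record count followed by the records) are assumptions made purely for illustration; the real type and format would differ.

// A minimal sketch of the external-file approach, assuming a hypothetical
// two-field Data struct and a "data.bin" file laid out as a record count
// followed by the records themselves.
using System.IO;

struct Data
{
    public int Key;    // hypothetical field
    public int Value;  // hypothetical field
}

static class PrecomputedData
{
    // Reads the whole file once at startup; afterwards, lookups cost the
    // same as with the hard-coded array, since both end up in memory.
    public static Data[] Load(string path)
    {
        using (var reader = new BinaryReader(File.OpenRead(path)))
        {
            int count = reader.ReadInt32();
            var result = new Data[count];
            for (int i = 0; i < count; i++)
            {
                result[i].Key = reader.ReadInt32();
                result[i].Value = reader.ReadInt32();
            }
            return result;
        }
    }
}

// Usage: Data[] data = PrecomputedData.Load("data.bin");

Either way the array lives in memory once loaded; the main difference is whether the data ships inside the assembly or as a separate file that has to be read and parsed once at startup.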

