Search Results

Search found 80052 results on 3203 pages for 'data load performance'.

Page 267/3203

  • zfs raidz1-0 corrupted data, all disks are online

    - by gkcr2d2
    my zfs pool "datas" is unavailable but all my disk are online, Do you know how the problem can be fixed? Thank you for your help. root@oxygen:~# zpool status pool: datas state: UNAVAIL scan: none requested config: NAME STATE READ WRITE CKSUM datas UNAVAIL 0 0 0 insufficient replicas raidz1-0 UNAVAIL 0 0 0 corrupted data scsi-SATA_ST3000DM001-9YN_W1F0LBVX ONLINE 0 0 0 scsi-SATA_ST3000DM001-9YN_W1F0KYX9 ONLINE 0 0 0 scsi-SATA_ST3000DM001-9YN_W1F0LCC8 ONLINE 0 0 0 scsi-SATA_ST3000DM001-9YN_W1F0LBXZ ONLINE 0 0 0

    Read the article

  • Updated Master Data Services Documentation and Resources

    - by mattande
    (This post was contributed by Reagan Templin, Lead Technical Writer for the MDS Team) With the release of SQL Server 2008 R2, it’s a great time to check out the updated documentation and resources for the release, and for SQL Server 2008 R2 Master Data Services ("MDS") in particular. As you saw in the last post (New White Papers Available), there are some great white papers available on MSDN to get you going with MDS. Below you’ll find more information about other updated and newly published content....(read more)

    Read the article

  • What is a better way to retrieve data in an application that handles a limited amount of data?

    - by Milanix
    This is not really a coding question, since I am not adding any code here; including my snippets would make the question really long. Instead, I am interested in better ways to retrieve data in an application that handles a limited amount of data which isn't updated regularly.

    Let's take this example: I am writing an application which gets a schedule as XML from a server. I have written logic to parse the XML version and update the database only if that version is newer than the local version. Although the update is checked automatically or manually on a daily basis, depending on user preference, the actual version update happens only once every few months or so, since it is made by another authority which doesn't provide an API but rather announces its changes publicly. The XML contains (n groups) x (days in a week) x (n schedules); the group count is usually 6 and the schedule count usually 2, so there would typically be only around 100 strings.

    I am using SQLite at the moment, but I want to know how to perform the update on the database. Should I show a progress dialog saying the application is updating and block until it's done? Since my updates are infrequent, I don't think this would really harm the user experience, but is there a better way? I don't want the update to run while the user is searching, which also uses the database; that causes a "database already open" exception, or at least it has for me before. Also, is it better to parse the XML every time the user wants to view something, or to use SQLite? And since I make heavy use of adapters in my app to create lists, will that degrade performance?

    It would really be a great help if anyone could give me a better overview, or counterarguments against each approach. Many thanks!
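
    A hedged sketch, not part of the original question: on Android, the "database already open" conflict usually disappears if every reader and writer shares a single SQLiteOpenHelper instance and the import runs inside one transaction, so searches never observe a half-applied schedule. The android.database.sqlite calls below are standard API; the table layout and class names are assumptions for illustration.

        import android.content.ContentValues;
        import android.content.Context;
        import android.database.sqlite.SQLiteDatabase;
        import android.database.sqlite.SQLiteOpenHelper;
        import java.util.List;

        // Minimal sketch: one shared helper, one atomic bulk update.
        public class ScheduleStore extends SQLiteOpenHelper {

            public ScheduleStore(Context ctx) {
                super(ctx, "schedules.db", null, 1);
            }

            @Override
            public void onCreate(SQLiteDatabase db) {
                db.execSQL("CREATE TABLE schedule ("
                        + "group_id INTEGER, day INTEGER, slot INTEGER, text TEXT)");
            }

            @Override
            public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
                // Nothing to migrate in this sketch.
            }

            // Replace the whole schedule atomically (~100 rows, so this is quick).
            // Readers sharing this helper block only for the commit instant.
            public void replaceAll(List<ContentValues> rows) {
                SQLiteDatabase db = getWritableDatabase();
                db.beginTransaction();
                try {
                    db.delete("schedule", null, null);
                    for (ContentValues row : rows) {
                        db.insert("schedule", null, row);
                    }
                    db.setTransactionSuccessful(); // commit on endTransaction()
                } finally {
                    db.endTransaction(); // rolls back if not marked successful
                }
            }
        }

    With replaceAll run on a background thread, a transaction over ~100 rows commits fast enough that no progress dialog should be needed.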

    Read the article

  • How to calculate required switch speed based on network usage?

    - by tobefound
    I have a 48-port HP ProCurve Switch 2610 (J9088A) that can handle 13.0 million PPS (packets per second) and features wire-speed switching capacity at 17.6 Gbps. First off, what does that REALLY mean? Where do I start when trying to figure out whether my office (with 70 employees) will be well served by this switch? How do I calculate throughput based on an average load of X MB per user per day? 90% of the folks will only be sending email, accessing random websites, etc.; the other 10% will be doing heavier tasks like moving image files (10 MB) across network shares, running constant external FTP streams through the switch to a server, etc. Is this switch good enough?
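
    As a rough sanity check (my own back-of-envelope arithmetic, not from the post, and assuming the 2610-48's usual configuration of 48 10/100 ports plus two gigabit uplinks):

        A 64-byte frame occupies 84 bytes (672 bits) on the wire, counting preamble and inter-frame gap.
        One 100 Mbps port:  100,000,000 / 672 ≈ 148,800 packets/second
        48 ports ≈ 7.1 Mpps; two gigabit uplinks add ≈ 3.0 Mpps; total ≈ 10.1 Mpps
        => 13.0 Mpps lets the switch forward minimum-size frames at wire speed on every port at once.
        Bandwidth side: even if all 7 "heavy" users saturated 100 Mbps links simultaneously,
        that is 0.7 Gbps against the 17.6 Gbps switching fabric.

    So on paper the switch itself is not the bottleneck for 70 office users; the uplink to the server and the server's disks are the more likely limits.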

    Read the article

  • CTERA Adds Data Protection to Linux File Systems

    Enterprise Storage Forum: "CTERA Networks is giving the Linux Ext3 file system additional data protection in the form of new snapshot capabilities. The file system is also the basis of the company's Cloud-Attached Storage appliances, the C200 and CloudPlug."

    Read the article

  • historical weather data APIs

    - by AJ.
    I am building a web application where I need to display a whole year's weather conditions, month by month, so that users get an idea of what the weather is like and can plan their trips accordingly. I am using WunderGround's History feature, but it does not provide this data for smaller towns and destinations, including some very popular tourist destinations. Are there any alternatives that could provide the same information?

    Read the article

  • This Week on the Green Data Center Management Front

    Among the big news this week in green data center management: Equinix was granted LEED certification for its 2009 retrofit of its Silicon Valley SV2 International Business Exchange facility, Neuwing Energy Ventures announced it successfully registered the first voluntary Energy Efficiency Certificates in the newly launched APX North American Renewables Registry, and more.

    Read the article

  • Upgrading Data Tier Applications in SQL Server 2008 R2

    Changes are inevitable, and like many other things in life your application will change over time. The question is how to upgrade an already-deployed Data Tier Application to a newer version: what methods are available for the upgrade, and what considerations should you take into account?

    Read the article

  • New Master Data Services Training Available

    - by mattande
    [posted by Suzanne Selhorn, Technical Writer on the MDS team] Some new self-paced training is now available on the Microsoft Download Center. To take advantage of this training, you should have a working installation of MDS with sample data already loaded.

    01 Introduction
    http://download.microsoft.com/download/5/9/F/59F1639E-EF57-4915-8848-EF1DC2157EBB/01 Introduction.pdf
    This lesson provides an overview of MDS.

    02 MDS Environment
    http://download.microsoft.com/download/5/9/F/59F1639E-EF57-4915-8848-EF1DC2157EBB/02...(read more)

    Read the article

  • SSIS 2008 Import and Export Wizard and Excel-based Data

    Even though the Import and Export Wizard, incorporated into the SQL Server 2008 platform, greatly simplifies the creation of SQL Server Integration Services packages, it has its limitations. This article points out the primary challenges associated with using it to copy data between SQL Server 2008 and Excel and presents methods of addressing these challenges.

    Read the article

  • Database Deployment: The Bits - Copying Data Out

    Occasionally, when deploying a database, you need to copy data out to file from all the tables in a database. Phil Factor shows how to do it, and illustrates its use by copying an entire database from one server to another.

    Read the article

  • Using an internal HDD as an external HDD, or an external HDD for installing SAP?

    - by Asterix
    Is it possible and advisable to use an internal hard drive as an external hard drive as well? I want to install SAP ECC 6 on my system, which has only 250 GB, but at least 300 GB is required. I wanted to buy an external drive first, but then I heard that loading SAP onto an external drive would make it extremely slow. I'll be using it only as a beginner, so I don't mind if it is a little slow. Is it feasible to run such a big application from an external hard disk? Can I purchase a 500 GB or 1 TB internal hard disk and also use it as an external one by fitting it with the necessary USB 3.0 case and cables? Or should I purchase an external drive and load SAP onto it? Thank you.

    Read the article

  • Adding multiple data importers support to web applications

    - by DigiMortal
    I’m building a web application for a customer, and one of the requirements is that users must be able to import data in different formats. Today we support XLSX and ODF as import formats, and other formats are waiting. I wanted to be able to add new importers on the fly, so I don’t have to redeploy the web application whenever I add a new importer or change an existing one. In this posting I will show you how to build generic importer support into your web application.

    Importer interface

    All the importers we use must have something in common so we can easily detect them. To keep things simple I will use an interface here.

        public interface IMyImporter
        {
            string[] SupportedFileExtensions { get; }
            ImportResult Import(Stream fileStream, string fileExtension);
        }

    Our interface has the following members:

    - SupportedFileExtensions – string array of the file extensions the importer supports. This property helps us find out which import formats are available and which importer to use for a given format.
    - Import – the method that does the actual import work. Besides the file, given as a stream, we also pass in the file extension so the importer can decide how to handle the file.

    It is enough to get started. When building real importers, I am sure you will switch over to an abstract base class.

    Importer class

    Here is a sample importer that imports data from Excel and Word documents. With no implementation details, the importer class looks like this:

        public class MyOpenXmlImporter : IMyImporter
        {
            public string[] SupportedFileExtensions
            {
                get { return new[] { "xlsx", "docx" }; }
            }

            public ImportResult Import(Stream fileStream, string extension)
            {
                // ...
            }
        }

    Finding supported import formats in the web application

    Now that we have importers, it’s time to add them to the web application. Usually there is one page or ASP.NET MVC controller that needs the importers. To this page or controller we add the following method, which uses reflection to find all classes that implement our IMyImporter interface.

        private static string[] GetImporterFileExtensions()
        {
            var types = from a in AppDomain.CurrentDomain.GetAssemblies()
                        from t in a.GetTypes()
                        where t.GetInterfaces().Contains(typeof(IMyImporter))
                        select t;

            var extensions = new Collection<string>();
            foreach (var type in types)
            {
                var instance = (IMyImporter)type.InvokeMember(null,
                               BindingFlags.CreateInstance, null, null, null);

                foreach (var extension in instance.SupportedFileExtensions)
                {
                    if (extensions.Contains(extension))
                        continue;

                    extensions.Add(extension);
                }
            }

            return extensions.ToArray();
        }

    This code doesn’t look nice and is far from optimal, but it works for now. We could improve the web application’s performance by caching the extensions and their corresponding types in a static dictionary; we would have to fill it only once, because the application is restarted whenever something changes in the bin folder.

    Finding an importer by extension

    When a user uploads a file, we need to detect the file’s extension and find the importer that supports it. We add another method to our page or controller, which uses reflection to return an importer instance, or null if the extension is not supported.

        private static IMyImporter GetImporterForExtension(string extensionToFind)
        {
            var types = from a in AppDomain.CurrentDomain.GetAssemblies()
                        from t in a.GetTypes()
                        where t.GetInterfaces().Contains(typeof(IMyImporter))
                        select t;

            foreach (var type in types)
            {
                var instance = (IMyImporter)type.InvokeMember(null,
                               BindingFlags.CreateInstance, null, null, null);

                if (instance.SupportedFileExtensions.Contains(extensionToFind))
                {
                    return instance;
                }
            }

            return null;
        }

    Here is an example ASP.NET MVC controller action that accepts an uploaded file, finds an importer that can handle it, and imports the data. Again, this is sample code I kept minimal to better illustrate how things work.

        public ActionResult Import(MyImporterModel model)
        {
            var file = Request.Files[0];
            var extension = Path.GetExtension(file.FileName).ToLower();
            var importer = GetImporterForExtension(extension.Substring(1));

            var result = importer.Import(file.InputStream, extension);
            if (result.Errors.Count > 0)
            {
                foreach (var error in result.Errors)
                    ModelState.AddModelError("file", error);

                return Import();
            }

            return RedirectToAction("Index");
        }

    Conclusion

    That’s it. Using a couple of ugly methods and one simple interface, we were able to add importer support to our web application. The example code here is not perfect, but it works. It is possible to cache the mappings between file extensions and importer types in a static variable, because a change in those mappings means something changed in the web application’s bin folder, and the application is restarted in that case anyway.

    Read the article

  • High CPU Steal percentage on Amazon EC2 Instance

    - by Aditya Patawari
    I am experiencing a high CPU steal percentage on an Amazon EC2 large instance. I know it means that my virtual CPU is waiting on the real CPU of the machine for time. My question is: what can I do to reduce this percentage and get the maximum out of the CPU? The steal percentage is consistently at 20%, and system load crosses 10 when this happens. I have checked memory and network and I am sure they are not the bottleneck. Is that normal for such an environment? Also, are there any system-level optimization techniques for reducing the steal percentage from within the virtual instance?

        avg-cpu:  %user  %nice  %system  %iowait  %steal  %idle
                  52.38   0.00     8.23     0.00   21.21  18.18

    Read the article

  • How to choose a NoSQL database engine?

    - by Poma
    We have a database with the following specs:

    - 30k records, 7 MB in size
    - 20 inserts/second
    - 1000 updates/second
    - 1000 range selects/second, by secondary index, approx. 10 rows each
    - needs at least one secondary index
    - needs some mechanism to expire keys if they are not updated for 75 seconds (can be done via a programmatic garbage collector, but that would require an additional 'last_update' index and add some load)
    - consistency is not required
    - durability is not required
    - the DB should be stored in memory

    For now we use Redis, but it does not have secondary indexes, and scanning keys with index:foo:* is too slow. Membase also does not have secondary indexes (as far as I know). MongoDB and the MySQL memory engine have table-level locks. What engine will fit our use case?
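
    For what it's worth, a hedged sketch (my illustration, not from the post): Redis sorted sets can stand in for both the secondary index and the 75-second expiry, using the indexed value and the last-update timestamp as scores, which avoids the slow index:foo:* key scan. The Jedis calls below (zadd, zrangeByScore, zrem, del) are real client API; the key names and record layout are assumptions.

        import redis.clients.jedis.Jedis;

        public class SortedSetIndex {
            public static void main(String[] args) {
                try (Jedis jedis = new Jedis("localhost", 6379)) {
                    long now = System.currentTimeMillis() / 1000;

                    // One record plus two sorted-set "indexes":
                    // score = indexed attribute, and score = last-update time.
                    jedis.set("rec:42", "{...payload...}");
                    jedis.zadd("idx:price", 19.99, "rec:42");   // secondary index
                    jedis.zadd("idx:touched", now, "rec:42");   // expiry index

                    // Range select via the secondary index (~10 rows per query):
                    for (String key : jedis.zrangeByScore("idx:price", 10, 20)) {
                        String payload = jedis.get(key);        // fetch matching record
                    }

                    // Expire everything untouched for 75 seconds:
                    for (String key : jedis.zrangeByScore("idx:touched", 0, now - 75)) {
                        jedis.del(key);
                        jedis.zrem("idx:price", key);
                        jedis.zrem("idx:touched", key);
                    }
                }
            }
        }

    The trade-off is that every update must touch both sorted sets, which should fit the 1000 updates/second budget but is worth benchmarking.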

    Read the article

  • glTexImage2D not loading my data

    - by Clyde
    Can anyone suggest why this code doesn't work? When I draw using this texture, all I get is black. If I use GLUtils.texImage2D() to load a PNG file, it works correctly.

        ByteBuffer bb = ByteBuffer.allocateDirect(128 * 128 * 4).order(ByteOrder.nativeOrder());
        bb.position(0);
        for (int row = 0; row != 128; row++) {
            for (int i = 0; i != 128; i++) {
                bb.put((byte) 0x80);
                bb.put((byte) 0xFF);
                bb.put((byte) 0xFF);
                bb.put((byte) i);
            }
        }

        int[] handle = new int[1];

        GLES20.glEnable(GLES20.GL_TEXTURE_2D);
        GLES20.glGenTextures(1, handle, 0);
        DrawAdapter.checkGlError("Gen textures");
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, handle[0]);
        DrawAdapter.checkGlError("Bind textures");

        bb.position(0);
        GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, 128, 128, 0,
                GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, bb);
        DrawAdapter.checkGlError("glTexImage2D");

        return handle[0];
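
    A hedged guess at the cause, not stated in the thread: in OpenGL ES 2.0 the default minification filter is GL_NEAREST_MIPMAP_LINEAR, so a texture uploaded without mipmap levels is incomplete and samples as black (and glEnable(GL_TEXTURE_2D) is not a valid ES 2.0 call; it is leftover ES 1.x state). Setting non-mipmapped filters right after binding usually fixes exactly this symptom:

        // After glBindTexture and before drawing: make the texture complete
        // without mipmaps (the default MIN filter expects a full mipmap chain).
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        // Or keep the default filter and build the mipmap chain instead:
        // GLES20.glGenerateMipmap(GLES20.GL_TEXTURE_2D);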

    Read the article
