Search Results

Search found 6875 results on 275 pages for 'crash reports'.

Page 173/275 | < Previous Page | 169 170 171 172 173 174 175 176 177 178 179 180  | Next Page >

  • df -h overreports disk space on VPS

    - by Rincewind42
    When I run the command df -h on my new Ubuntu Linux vServer I get the following: # df -h Filesystem Size Used Avail Use% Mounted on /dev/hdv1 466G 33G 434G 7% / none 16M 0 16M 0% /tmp Running du -sh gives # du -sh du: cannot access `./proc/13624/task/13624/fd/4': No such file or directory du: cannot access `./proc/13624/task/13624/fdinfo/4': No such file or directory du: cannot access `./proc/13624/fd/4': No such file or directory du: cannot access `./proc/13624/fdinfo/4': No such file or directory 952M . The VPS should only have 5 GB of disk space, but df reports 466 GB. How can I view the correct amount of disk space?
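    To measure what the container is actually using without the /proc noise, du can be told to stay on the root filesystem, which skips virtual mounts such as /proc; a minimal sketch (the inflated 466G figure most likely reflects the virtualisation host's disk rather than the VPS quota, which is an assumption here):

        # -x stays on one filesystem, so /proc and other virtual mounts are skipped
        du -shx /

        # per-directory breakdown of real usage, sorted by size
        du -hx --max-depth=1 / | sort -h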

    Read the article

  • Protect value from changes using reflection?

    - by IordanTanev
    Hi, here is the problem case: I am writing a little third-party library. In this library I have a class like this public class TestClass { public int TestField { get; private set; } public TestClass( ) { TestField = 1; } } Then I have a variable of this class like this public TestClass test = new TestClass( ); The problem I am facing is that using reflection like this PropertyInfo field = typeof( TestClass ).GetProperty( "TestField" ); field.SetValue( test, 2, null ); programmers can change the internal value of this class. This would be very bad because it can crash the whole library. My question is: what is the best way to protect my code from such changes? I know I can use some kind of bool flag so the value can be changed only once, but this is not a very good solution. Is there a better one? Best Regards, Iordan
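    For context, reflection running with full trust can always bypass access modifiers, so there is no complete protection; what can be done is to remove the easy PropertyInfo.SetValue route by exposing a getter-only property over a readonly field. A sketch, not a full defence (FieldInfo.SetValue can still reach the backing field):

        public class TestClass
        {
            // readonly backing field: there is no setter for GetProperty(...).SetValue to call
            private readonly int _testField;

            public int TestField
            {
                get { return _testField; }
            }

            public TestClass()
            {
                _testField = 1;
            }
        }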

    Read the article

  • Populating a WPF listbox with items from an SQL (SDF) database

    - by xplinux557
    I have been searching on how to do this for a very long time, and I have not managed to get a straight answer on the subject, so hopefully one of you StackOverflow users will be able to help me here. I have a WPF ListBox named CategoryList and an SDF database called ProgramsList.sdf (with two tables called CategoryList and ProgramsList). What I wish my program to do is get the category names from the CategoryList table and list them in the ListBox control called CategoryList. Here's the code that I tried, but it only caused my program to crash. SqlConnection myConnection = new SqlConnection("Data Source=" + AppDomain.CurrentDomain.BaseDirectory + "ProgramsList.sdf"); SqlDataReader myReader = null; myConnection.Open(); CategoryList.Items.Clear(); SqlDataReader dr = new SqlCommand("SELECT Name FROM CategoryList ORDER BY Name DESC", myConnection).ExecuteReader(); while (myReader.Read()) { CategoryList.Items.Add(dr.GetInt32(0)); } myConnection.Close(); Can anyone help me? Thanks in advance!
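    Two things stand out in the snippet as posted: the command executes into dr while the loop reads from myReader (which is still null), and Name is read with GetInt32 although it is presumably text. Also, an .sdf file is a SQL Server Compact database, so the SqlCe classes are needed rather than SqlConnection. A corrected sketch under those assumptions (table and column names taken from the question, a reference to System.Data.SqlServerCe required):

        using (var myConnection = new SqlCeConnection(
            "Data Source=" + AppDomain.CurrentDomain.BaseDirectory + "ProgramsList.sdf"))
        using (var cmd = new SqlCeCommand(
            "SELECT Name FROM CategoryList ORDER BY Name DESC", myConnection))
        {
            myConnection.Open();
            CategoryList.Items.Clear();
            using (SqlCeDataReader dr = cmd.ExecuteReader())
            {
                while (dr.Read())
                {
                    CategoryList.Items.Add(dr.GetString(0));   // Name is a string, not an int
                }
            }
        }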

    Read the article

  • DataSet binding problem

    - by Shaine
    I've got an in-memory DataSet with a table defined, and I populate this table in the following way: for(...) ds.Fields.AddFieldsRow(++j, 0, heading, "Char", "", "", "Input", 0, "","",""); On the GUI I've got a DataGridView bound to that table inside a TabControl (bound through a BindingSource). A very strange thing is happening: if I open the tab pane with this grid and populate the table with some data, then I see the changes in the grid. On the other hand, if I'm on another tab, populate the table, and then switch to the tab with the grid, I get the following exception: "DataMember property 'Fields' cannot be found on the DataSource". In a similar way I've got 2 tab panes with a grid in each that are bound to the same DataTable using different data sources; I open one of them, populate, see the changes, then switch to the second tab and it crashes. What am I missing?
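    One workaround worth trying (a sketch, assuming a typed DataSet instance named ds and designer-generated control names) is to point the BindingSource at the DataTable itself rather than relying on the string DataMember lookup that the exception complains about:

        // bind to the table directly; no "Fields" DataMember string to resolve
        fieldsBindingSource.DataSource = ds.Fields;
        fieldsDataGridView.DataSource = fieldsBindingSource;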

    Read the article

  • 2 GB of memory in 1 GB system is a problem?

    - by daveslab
    Hi folks, I just installed two 1 GB sticks into my friend's machine, thinking that it would use all 2 GB. Unfortunately, according to Dell's website, the maximum amount of memory accessible to the machine is arbitrarily set to 1 GB! The system indeed reports having 1 GB of memory accessible to it, but I'm worried that having 2 GB in there might break something. Are my fears reasonable? Should I buy two 512 MB sticks instead? Thanks for any help!

    Read the article

  • HRESULT exception not caught in VS 2008

    - by arionik
    Hello all, I've got a strange situation in Visual Studio 2008 C++. I work on code that was originally written for Visual Studio 2003, where everything works well. Now, ported to VS 2008, the exception handling, which unfortunately is used widely in the code, does not work any more. Standard code example: try { HRESULT hr = S_OK; // do stuff... if( FAILED( hr ) ) throw hr; } catch( HRESULT hr ) { // error handling, but we never get here } catch( ... ) { // ... not even here } Under VS 2008, no exception is caught, but I get a crash somewhere else, indicating that the stack pointer must be screwed up. Did anybody come across this behaviour? Any help is appreciated.
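    One thing worth checking after a VS2003-to-VS2008 project conversion is the exception-handling switch (/EHa vs /EHsc) on the affected projects, since what catch(...) can see depends on it. Independently of that, wrapping the HRESULT in a class type avoids relying on catch-by-fundamental-type; a sketch (hresult_error and do_stuff_checked are illustrative names, not from the original code):

        #include <stdexcept>
        #include <windows.h>   // HRESULT, FAILED, E_FAIL

        struct hresult_error : std::runtime_error
        {
            HRESULT hr;
            explicit hresult_error(HRESULT h)
                : std::runtime_error("HRESULT failure"), hr(h) {}
        };

        void do_stuff_checked()
        {
            HRESULT hr = E_FAIL;              // stand-in for the real call
            if (FAILED(hr))
                throw hresult_error(hr);      // throw a class type instead of a raw HRESULT
        }

        int main()
        {
            try
            {
                do_stuff_checked();
            }
            catch (const hresult_error& e)
            {
                // error handling would inspect e.hr here
                (void)e;
            }
            return 0;
        }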

    Read the article

  • Class design question (Disposable and singleton behavior)

    - by user137348
    The Repository class has singleton behavior and _db implements the disposable pattern. As expected, the _db object gets disposed after the first call, and because of the singleton behavior any further call on _db will crash. [ServiceBehavior(InstanceContextMode=InstanceContextMode.Single)] public class Repository : IRepository { private readonly DataBase _db; public Repository(DataBase db) { _db = db; } public int GetCount() { using(_db) { return _db.Menus.Count(); } } public Item GetItem(int id) { using(_db) { return _db.Menus.FirstOrDefault(x=>x.Id == id); } } } My question is: is there any way to design this class to work properly without removing the singleton behavior? The Repository class will be serving a large number of requests.
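    One common way to keep the single service instance (a sketch; the Func<DataBase> factory is an assumption, not part of the original code) is to inject a factory and create a short-lived context per call, so nothing the singleton holds onto ever gets disposed out from under it:

        [ServiceBehavior(InstanceContextMode = InstanceContextMode.Single)]
        public class Repository : IRepository
        {
            private readonly Func<DataBase> _dbFactory;

            public Repository(Func<DataBase> dbFactory)
            {
                _dbFactory = dbFactory;
            }

            public int GetCount()
            {
                using (var db = _dbFactory())      // fresh context, disposed after this call
                {
                    return db.Menus.Count();
                }
            }

            public Item GetItem(int id)
            {
                using (var db = _dbFactory())
                {
                    return db.Menus.FirstOrDefault(x => x.Id == id);
                }
            }
        }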

    Read the article

  • How can I sync Access databases and keep them up-to-date?

    - by user327472
    I have an Access database on my server. We split it up and use the front-end database for searching data and adding new records or reports on a local computer. If we update or add a new record, that is written to the back-end database. I want to use this database in another building with other servers. Also, those servers have no direct connection. How can I sync both back-end databases to keep the data up to date? These details may be useful: it's a large amount of data, about 25,750 client records. I guess there are more than 25 tables at 80 MB.

    Read the article

  • So where is this calling super?

    - by dontWatchMyProfile
    From the Core Data docs: Inheritance If you have two subclasses of NSManagedObject where the parent class implements a dynamic property and its subclass (the grandchild of NSManagedObject) overrides the methods for the property, those overrides cannot call super. @interface Parent : NSManagedObject @property(nonatomic, retain) NSString* parentString; @end @implementation Parent @dynamic parentString; @end @interface Child : Parent @end @implementation Child - (NSString *)parentString { // this throws a "selector not found" exception return parentString.foo; } @end very, very funny, because: I see nobody calling super. Or are they? Wait... parentString.foo results in ... a crash ??? it's a string. How can that thing have a .foo suffixed to it? Just another documentation bug?

    Read the article

  • Excel DataFlow UML Viewer/Navigator/Visualiser tool/hint

    - by Arjang
    Not sure what to call it, but is there a bird's-eye-view tool for Excel to show the data flow between sheets/cells etc.? I have inherited some huge reports, and looking at each cell to see where its data comes from or what sheet/cell dependencies it has is a nightmare. Or even just something within Excel that shows the dependencies of the cells in a sheet on each other. Or any other visualization tool that can show the data flow between cells (I tried Visio, but it seems to be only for making diagrams of data, not the data model of Excel itself). Or at least, if I am within a cell and see a formula referring to other sheets and cells, is there a quick way to navigate there and back? Like code navigation in VS? Thank you for your help

    Read the article

  • How can I debug a Perl program that suddenly exits?

    - by taw
    I have a Perl program based on IO::Async, and it sometimes just exits after a few hours/days without printing any error message whatsoever. There's nothing in dmesg or /var/log either. STDOUT/STDERR both have autoflush(1) set, so data shouldn't be lost in buffers. It doesn't actually return from IO::Async::Loop->loop_forever; a print I put there just to make sure of that never gets triggered. Now one way would be to keep peppering the program with more and more prints and hope one of them gives me some clue. Is there a better way to get information about what was going on in a program that made it exit/silently crash?
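    A low-effort first step (a sketch, not specific to IO::Async) is to install handlers that record the more common silent exit paths: an uncaught die, a catchable fatal signal, and the END block that runs on any normal exit. SIGKILL and an out-of-memory kill cannot be caught this way, so dmesg is still worth re-checking for those.

        use strict;
        use warnings;

        # log uncaught exceptions before they terminate the program
        $SIG{__DIE__} = sub { warn "DIED: @_"; };

        # log catchable termination signals
        $SIG{TERM} = sub { warn "caught SIGTERM\n"; exit 1; };
        $SIG{INT}  = sub { warn "caught SIGINT\n";  exit 1; };

        END {
            # runs on any normal exit path; $? is the status about to be returned
            warn "END block reached, exit status $?\n";
        }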

    Read the article

  • CSS - Why am I not able to set height and width of <a href> elements?

    - by Kenny Bones
    Hi, I'm trying to create CSS buttons using the following HTML markup: <a href="access.php" class="css_button_red">Forgot password</a> But it ends up no bigger than the text in the middle, even though I've set the class's height and width. You can preview the problem here btw: www.matkalenderen.no Notice the first button; that's a form button and it's using its own class. At first I tried to use the same class on the CSS button as well and the same problem appeared, so I tried to separate them into their own classes, in case there was some kind of clash. But it didn't matter anyway. What am I missing here?
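    Anchors are inline elements, and width/height are ignored on non-replaced inline boxes, which is why the link stays the size of its text. Giving the class a block-level or inline-block display is the usual fix; a sketch (the class name is from the question, the dimensions are placeholders):

        .css_button_red {
            display: inline-block;   /* makes width and height apply to the <a> */
            width: 120px;
            height: 32px;
            line-height: 32px;       /* vertically centres the label */
            text-align: center;
        }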

    Read the article

  • Is Private Bytes >> Working Set?

    - by Jacob
    OK, this may sound weird, but here goes. There are 2 computers, A (Pentium D) and B (Quad Core), with almost the same amount of RAM, running Windows XP. If I run the same code on both computers, the allocated private bytes on A never go down, resulting in a crash later on. On B it looks like the private bytes are constantly deallocated and everything looks fine. On both computers, the working set is deallocated and allocated similarly. Could this be an issue with manifests or system DLLs? I'm clueless. Note: I observed the memory usage with Process Explorer. Question: during execution (where we have several allocations and deallocations), is it normal for the number of private bytes to be much bigger (1.5 GB vs 70 MB) than the working set?

    Read the article

  • Cloned Windows 7 to new HDD and want to change the drive letter to C

    - by Hoppe
    I used Clonezilla to clone my existing hard drive to a new one I bought. I then changed the BIOS to set the new drive as the first in the boot sequence. I'm pretty sure that I'm still running Windows 7 from the old drive. My old drive is marked as C. Now that I don't have a disk drive any more, how do I swap the drive letter from J: to C:? I tried to change it in the Disk Management section of "Manage", but it reports: "The parameter is incorrect".

    Read the article

  • tortoisesvn - Error REPORT request failed on ../../../!svn/vcc/default

    - by John
    Users attempting to check out files from a particular Subversion 1.4.x repository with Apache 2.2 on Windows 2003 have suddenly begun getting an error message in their log windows upon checkout with TortoiseSVN 1.4: Error REPORT request failed on '/[path_to_repo]/!svn/vcc/default' Error REPORT of '/[path_to_repo]/!svn/vcc/default': 200 OK (http://[server_name]) This started following a hard-drive crash on the server and a subsequent restore of about 10 Subversion repositories. Only one repository is having this problem, after an attempted working-directory reconciliation: the repo owner reconciled their working directory with the repository by modifying/deleting the hidden .svn directories (though this was not advised). I can't find anything on the Internet that matches my situation. The restored server is exactly as the original and no other repositories on this server are throwing errors. Any ideas on 1) what this error is and 2) how to fix it?
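    Given that the working copies were hand-edited, one reasonable recovery path (a sketch; the repository path and URL are placeholders) is to verify the restored repository on the server and then take a fresh checkout instead of trying to repair the mangled .svn metadata:

        # on the server: check every revision of the restored repository
        svnadmin verify /svn/repositories/myrepo

        # on a client: abandon the edited working copy and check out cleanly
        svn checkout http://server_name/svn/myrepo myrepo-fresh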

    Read the article

  • Problems with Finder background images produced in Mac OS X 10.6?

    - by Joe
    Morning all. I'm creating DMG installers with background pictures. My build machine is 10.6. I'm having problems getting them working consistently: If I create one on 10.4 it works fine in 10.4, 10.5 and 10.6. If I create one on 10.5 it works fine in 10.4, 10.5 and 10.6. But if I create one on 10.6, the background picture shows in 10.6 but does not show in 10.4 or 10.5. I think I recall having seen similar reports in one or two places, but there's not much information on the web. Has anyone here come across this problem? Is it recognised? Fixable? Unfortunately I have no option of running 10.5 on my build machine... Thanks! Update: This is a confirmed known bug in 10.6. I will update this if there is any extra news.

    Read the article

  • How do you create a report in Word (or other documentation software) that is linked directly with Excel?

    - by NoCatharsis
    I believe my question may be best answered by using Access since that's more what it's made for. However, I don't have a license for Access here at work and trying to get one is pulling teeth. So I'm curious if there is any way to compile reports with data in an Excel 2007 sheet. The output can be .doc, .docx, .pdf - or anything else if there's a decent piece of free 3rd party software. This might be easiest solved by just creating another sheet in the same workbook and directly linking to the data I want to display in a report-esque format. But I wanted to see if SU could offer some more creative solutions.

    Read the article

  • Best data recovery tools?

    - by Nonick
    So due to a recent act of stupidity and bravado, I uttered the words "backups! who needs backups?!" and what followed was the tragic loss of 260 GB of data. This scenario in particular requires me to recover a repartitioned hard disk, but I was wondering what tools people here use in general to recover lost data. I'm sure everyone has been there: accidentally rewriting files, resaving an old version, a computer crash, hard disk death, a user deleting an important document, etc. So I was thinking it might be an interesting point of discussion as to what you use to recover lost data. I apologise if this is considered irrelevant, but considering there have been a few recovery questions, I think this might be interesting.

    Read the article

  • Wired and Wireless Network Duplication

    - by Dave
    Howdy! We're running into an issue when some of our clients have their laptops connected via the wired Ethernet network as well as on the WLAN of the same network. There are no issues caused to the end clients, but being a managed-services engineer I get pretty fed up with the alerts that come through on our reports for machines with the same hostname on the same network! We are not going to remove this monitoring, because it does help a lot with detecting and stopping inferior users and things like that. So basically the question is: is there a way in Windows (third-party programs welcome) to disable the wireless network when a wired network is connected and operational? I know that Windows automatically 'prefers' the wired network; however, they are still both connected, and therefore there are duplicate hostnames on the same network. This could also cause stupid issues with DNS and things like that! Thanks!
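    There are third-party utilities built for exactly this, but a rough scripted sketch of the idea (adapter names are assumptions; list the real ones with "netsh interface show interface", and the "Connected" string match may need adjusting per Windows version) could run from a scheduled task and toggle the WLAN adapter based on the wired link state:

        rem check whether the wired NIC currently reports "Connected"
        netsh interface show interface name="Local Area Connection" | find "Connected" >nul
        if %errorlevel%==0 (
            netsh interface set interface name="Wireless Network Connection" admin=disabled
        ) else (
            netsh interface set interface name="Wireless Network Connection" admin=enabled
        )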

    Read the article

  • autosave pattern

    - by Mark
    I'm using localStorage to do Gmail-style autosave on a web page. So I basically save to localStorage every 30 seconds, OK. The problem is recovery: I can't detect whether or not a user has crashed or exited incorrectly. So let's say the user crashed and loads up the form again; I can't just continue saving and overwriting the previous autosaves, I need to restore the previous save. But let's say the user didn't crash. He did everything correctly, but then used a different browser to edit the same file, so no new data went to the previous browser's localStorage. He then loads up the file in the previous browser. The localStorage should not be restored in that case. Assuming there's no way to compare timestamps, how can I solve this problem? Thanks.
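    Without timestamps, one workable trick is a "dirty" flag stored alongside the draft: set it on every autosave, clear it on a clean exit, and on load only offer a restore if the flag is still present, meaning the last session ended abnormally. A sketch (the key names are made up):

        var DRAFT_KEY = 'draft';
        var DIRTY_KEY = 'draft-dirty';

        function autosave(text) {
            localStorage.setItem(DRAFT_KEY, text);
            localStorage.setItem(DIRTY_KEY, '1');        // unsaved changes exist
        }

        function cleanExit() {                           // call on submit / proper close
            localStorage.removeItem(DIRTY_KEY);
        }

        function draftToRestore() {
            if (localStorage.getItem(DIRTY_KEY) === '1') {
                return localStorage.getItem(DRAFT_KEY);  // last session ended dirty
            }
            return null;                                 // clean exit: nothing to restore
        }

    Because a clean exit removes the flag, a stale draft left behind after editing in another browser is not offered for restore, which matches the second scenario above.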

    Read the article

  • how to construct a long string

    - by david
    I need to construct a long string with JavaScript. That's how I tried to do it: var html = '<div style="balbalblaba">&nbsp;</div>'; for(i = 1; i <= 400; i++){ html=+html; }; When I execute that in Firefox it takes ages or crashes. What is the best way to do that? What is generally the best way to construct big strings in JS? Can someone help me?
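    As written, html=+html assigns the unary plus of html (NaN for this markup) instead of appending anything, so the loop never builds the intended string. For assembling large strings, a common pattern is to push the pieces into an array and join once at the end; a sketch:

        var parts = [];
        for (var i = 1; i <= 400; i++) {
            parts.push('<div style="balbalblaba">&nbsp;</div>');
        }
        var html = parts.join('');   // a single concatenation at the end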

    Read the article

  • Why am I having trouble viewing HTTPS websites only using Chrome only on my employer's network?

    - by user1742777
    I'm using Google Chrome on my new MacBook Pro that was provided to me by my employer. Many of the HTTPS sites I visit do not work when I visit them using Google Chrome while I am connected to my employer's network. Example: www.facebook.com These same sites work perfectly fine if I use a different browser (like Safari), or even with Chrome when my MacBook is connected to my home WiFi network. Chrome reports the error: "The certificate was signed by an unknown authority". See attached screenshots. How can I resolve this problem? I really want to use Chrome, but not having access to numerous important work and outside websites is unacceptable.

    Read the article

  • Create a JavaScript Chrome extension that does not execute after 6 months

    - by user1907657
    I have just started learning programming and I would like to make a script into a Chrome extension. It's a basic script, and I hope to practice more and more, develop bigger projects and set myself bigger tasks. This script has to do the following: reload a page every 20 seconds (say google.com); after 6 months the script must not run (maybe prompt a window saying "it's over 6 months"). The code should be able to go into a small Chrome extension, and the 6-month time period should be absolute, not relative to the time the script was started; for example, should the browser crash and I have to turn on the extension again, it should not restart the 6-month counter. Also, could anyone recommend any good sources for learning JavaScript (preferably books; nothing I read online ever seems to stick)?
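    A sketch of the timing logic only (the storage key and the six-month arithmetic are assumptions; in a packaged extension chrome.storage.local would be the more idiomatic store than the page's localStorage): persist an absolute expiry date the first time the script runs, so a crash or a re-enable never restarts the clock.

        var stored = localStorage.getItem('expiresAt');
        if (!stored) {
            var sixMonthsMs = 1000 * 60 * 60 * 24 * 182;    // roughly six months
            stored = String(Date.now() + sixMonthsMs);
            localStorage.setItem('expiresAt', stored);      // absolute deadline, written once
        }

        if (Date.now() > Number(stored)) {
            alert("it's over 6 months");                    // expired: never start the timer
        } else {
            setInterval(function () {
                location.reload();                          // reload the page every 20 seconds
            }, 20000);
        }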

    Read the article

  • How to set Main Interface in Xcode programmatically

    - by Tom Tallak Solbu
    I am using Apple's MultipleDetailViews sample http://developer.apple.com/library/ios/#samplecode/MultipleDetailViews/Introduction/Intro.html as a template for my iPad app. The template uses a UISplitViewController set up in Interface Builder (MainWindow). In the iPhone/iPad deployment info of the target, MainWindow is set as the "Main Interface". I want my app to also run on iPhone. This means I need to load a different xib when the app is run on an iPhone. I must then remove "MainWindow" from "Main Interface", because the app will crash when run on an iPhone due to the UISplitViewController. The AppDelegate of the template looks like this: - (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions { self.window = [[[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]] autorelease]; self.window.rootViewController = self.splitViewController; [self.window makeKeyAndVisible]; } If I remove "MainWindow" from "Main Interface", how do I then need to change the AppDelegate, or maybe I need to change MainWindow.xib to also work for iPhone?
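    One common approach (a sketch; iPhoneRootViewController is an assumed property, not part of the template) is to leave "Main Interface" empty, build the window in code, and choose the root controller from the device idiom. Note that once no main nib is loaded automatically, the split view controller and its child controllers have to be created in code or loaded explicitly (for example with loadNibNamed:owner:options:).

        - (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
        {
            self.window = [[[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]] autorelease];

            if (UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPad) {
                self.window.rootViewController = self.splitViewController;          // iPad layout
            } else {
                self.window.rootViewController = self.iPhoneRootViewController;     // assumed iPhone controller
            }

            [self.window makeKeyAndVisible];
            return YES;
        }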

    Read the article

  • How long does it take in practice to warm up large in-memory databases?

    - by Sim
    Companies such as Peak Hosting are offering 64-core machines with 512 GB RAM for $2K/month. This is a very interesting choice for in-memory databases such as Memcached/Redis, as well as databases whose performance degrades rapidly when the data & indexes don't fit in RAM, such as MongoDB. My main concern with monster machines such as these is the time it takes to warm up an in-memory database. In my experience, theoretical metrics, e.g., that SATA can load 100 MB/sec, fall short of what happens in practice. Even at that rate, 100 MB/sec means that loading up a 512 GB RAM machine from SATA disks can take over 1.5 hours (!). I am looking for real-world reports of warm-up times for machines with very large memory. Please share details of the software on the machine, data size, storage configuration (e.g., SATA or SSD), network, hosting/cloud provider if relevant, etc.

    Read the article
