Search Results

Search found 1466 results on 59 pages for 'backing beans'.

Page 37/59 | < Previous Page | 33 34 35 36 37 38 39 40 41 42 43 44  | Next Page >

  • Is the (Ubuntu) Linux file copying algorithm better than Windows 7's?

    - by Sarath
    Windows file copying has been a real mess ever since Windows Vista. Even though Microsoft claims to have improved the performance, from a user's perspective the improvement isn't really visible. Even with a single file, the copy window spends far too much time on 'Calculating' before finishing the copy (and sometimes the dialog stays active even after 100% completion). At the same time, I was backing up some files in Ubuntu Linux and it felt really fast, though that impression might just come from faster UI updates. I read an informative post from Jeff Atwood a few years back on Windows file copying, but my specific questions are: Is (Ubuntu) Linux file copying performance better than Windows 7's? Do both Windows and Linux make use of multiple threads and pipelining to improve copy speed? If so, which one does it better?

    Read the article

  • Outlook Express hangs when selecting multiple emails

    - by Javier Badia
    I'm using Outlook Express (6, I think) on Windows XP. Lately, it has been hanging. Sometimes this happens at startup (right after the main window with all the panes loads) and sometimes when selecting many emails (sometimes as low as three emails at once, sometimes at ten, it's not a fixed number). When this happens, msimn.exe starts to use 98-100% CPU and RAM usage shoots up very quickly, reaching hundreds of megabytes in half a minute. The message pane goes gray instead of showing the message contents. As I said, this sometimes happens right after the main window loads, sometimes when selecting many emails at once. I tried backing up everything, deleting the identities, creating a new one and restoring, but this still happens.

    Read the article

  • backuppc - how to back up remote (over the internet) clients?

    - by Scott
    I am testing out BackupPC, which works great so far backing up Windows clients on a LAN via SMB (no backup client/agent required). However, I have quite a few laptops and desktops in various remote locations, some of which move around. I need some way to have those remote computers create an outgoing connection for backup purposes (Windows XP/7). I know BackupPC supports SMB, rsync and tar, but I believe these are all connections going from the server TO the client. So I either need a way to VPN the client in on a timed basis, or, better, some way for the client to connect to the server itself (ssh?) and initiate its own backup (rsync?). Of course, this all needs to be pre-installed by me and require no maintenance by the end user and no dialogs on their side. What do you think?
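
    A rough sketch of one client-initiated approach (the hostname, paths and key file are placeholders, and it assumes an rsync/SSH port such as cwRsync or Cygwin is installed on the Windows client): the laptop pushes its own backup to the server over SSH from a scheduled task, so the server never needs to reach the client.

        # Run from Windows Task Scheduler via cwRsync's bash wrapper;
        # the SSH key is limited to the backup account on the server.
        rsync -az --delete \
            -e "ssh -i /cygdrive/c/backup/client_key -o BatchMode=yes" \
            /cygdrive/c/Users/ \
            backup@backup.example.com:/srv/backups/laptop-01/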

    Read the article

  • Using Windows Azure storage for backup

    - by Bruno
    I am currently looking at Windows Azure blobs as an option for backing up archive data. I want to be able to upload files from an external Windows machine via the internet, but I don't know enough about Windows Azure storage to make a decision. Some of the questions I have are: How do I upload the files? Is there a client application? Can I use robocopy? Would it be fast enough, i.e. could I download or upload 1TB of data in a week? Is it secure? Hopefully someone smarter than me can help me :-)
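
    One hedged pointer (the storage account, container and SAS token below are placeholders): robocopy can't write to blob storage directly, but Microsoft's AzCopy command-line tool can copy a local folder tree into a blob container over HTTPS, which also covers encryption of the data in transit.

        # Upload a local archive folder recursively to a blob container
        azcopy copy "D:\archive" "https://<storage-account>.blob.core.windows.net/<container>?<SAS-token>" --recursive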

    Read the article

  • VM Build XML file fails to validate against OVF 1.0 schema

    - by siddharthgod
    For our product, we were trying to generate the VM / vApp build XML from Java code. For this purpose, we were using XMLBeans. When we tried to generate Java classes for the OVF 0.9 envelope (ovf-envelope.xsd in schemas/ovf) it was successful. However, that schema does not allow us to add the IP assignment section which is available in OVF 1.0. When we tried to compile the 1.0 schema (ovfenv-vmware.xsd in the schemas/ovf1.0.0e/vmware folder), we got validation errors. When we loaded this schema in a schema editor we could see some validation errors as well. The first error we saw was the following: when we loaded ovfenv-vmware.xsd in XMLSpy, we could see this validation error in dsp8027.xsd - "cos-nonambig: makes the content model non-deterministic against . Possible causes: name equality, overlapping occurrence or substitution groups." The same error was also thrown by XMLBeans while generating Java classes from ovfenv-vmware.xsd. Is there any workaround for this problem?

    Read the article

  • Welcome to the Java Training Beat!

    - by tmcginn
    We are a group of dedicated training developers for Java, located in the US, India, and now Mexico. In this blog we will announce new training content and events that might be of interest to our readers. In this first installment of the Java Training Beat, I would like to introduce three new Oracle By Example (OBE) modules I recently released and posted to the Oracle Online Learning Library. Creating a Simple Java Message Service (JMS) Producer with NetBeans and GlassFish - covers how to create a simple text message producer with NetBeans 7 and GlassFish. Creating Java Message Service (JMS) Resources in WebLogic Server 12c - covers how to create JMS resources using the console and WebLogic Server 12c. With this tutorial, you can replicate the results of the first tutorial in WebLogic. Creating a Publish/Subscribe Model with Message-Driven Beans and GlassFish Server - covers how to create a publish/subscribe application using JMS. This tutorial includes a short case study that includes a JSF front-end application that sends a hotel reservation request object to the server as a MapMessage. Hope you find these useful!  And do check out the Online Learning Library - we have a wide range of additional content posted and more being added every month!

    Read the article

  • Can I upgrade my Ubuntu version and make it the primary OS after originally installing with Wubi?

    - by Garrick Wann
    I recently installed Ubuntu 12.04 using Wubi 12.04, and I now wish to upgrade to a full installation of Ubuntu 14.04. Before attempting to upgrade through the update center, I did some research on upgrading from a Wubi installation (alongside Windows) to a full installation with Ubuntu as the primary and only OS, and found that it is in fact doable through the update center; it is just highly recommended to perform a full backup beforehand. I have now finished backing up all the data I need to worry about and began the upgrade process through the update center, and received the following error: Your graphics hardware may not be fully supported in Ubuntu 14.04. Running the 'unity' desktop environment is not fully supported by your graphics hardware. You will maybe end up in a very slow environment after the upgrade. Our advice is to keep the LTS version for now. For more information see https://wiki.ubuntu.com/X/Bugs/UpdateManagerWarningForUnity3D Do you still want to continue with the upgrade? My questions are as follows: A. Isn't 14.04 an LTS version? B. What are your recommendations to ensure my graphics driver is installed correctly and I'm not stuck with a bad configuration/install?

    Read the article

  • Multiple .bkf files created in Backup Exec 12.5 or 2010 related to heavy I/O?

    - by syuusuke
    Hey everyone, I was wondering if anyone who has used Backup Exec 12.5 or 2010 has ever experienced multiple .bkf files created for a single job. To describe what I mean by multiple files: the .bkf files are being created with random sizes under 2GB, even though I've configured the setting to split the file only after it reaches 10GB. Some jobs will create 20 .bkf files in a single job, with chunks ranging from 50MB to 800MB. Is this a sign of heavy I/O issues? Bandwidth limitations? I'm not sure; I'm here to seek advice and suggestions. I've set up another backup server with the same exact settings, and it seems to create a new .bkf file only when the 10GB limit has been reached. Although I am backing up different machines, I know my settings are an exact match to the problematic server's (or at least I think there's a problem).

    Read the article

  • Offsite Backup

    - by Grant Fritchey
    There was a recent weather event in the United States that seriously impacted our power grid and our physical well-being. Lots of businesses found that they couldn't get to their building, or that their building was gone. Many of them got to do a full test of their disaster recovery processes. A big part of DR is having the ability to get yourself back online in a different location. Now, most of us are not going to be paying for multiple sites, but we need the ability to move to one if needed. The best thing you can do to start setting this up is to have an off-site backup. Want an easy way to automate that? I mean, yeah, you can go to tape or to a portable drive (much more likely these days) and then carry that home, but we've all got access to off-site storage these days: SkyDrive, DropBox, S3, etc. How about just backing up to one of those? I agree. Great idea. That's why Red Gate is setting up some methods around it. Want to take part in the early access program? Go here and try it out.

    Read the article

  • Is it a good idea to take onsite/offsite backups of server images?

    - by ServerAdminGuy45
    Assuming a non-virtualized environment, is it a good idea to take actual images of servers (using something like Acronis True Image) and store them on/off site? Backing up data is great, but I feel it would be good to have copies of the OS images so that if hardware dies or an upgrade gets botched, I can always revert. What would be your recommended way to do this (preferably using a NAS and an online backup service)? I was talking with the Iron Mountain folks and the service they described is more geared toward taking incremental snapshots of data. I'm not sure if there's a way to back up images incrementally such that only the changes between them are saved (that way I'm not wasting X GB each time I take an image).

    Read the article

  • Windows 2008 Best Raid Configuration

    - by Brandon Wilson
    I have 4 x 2TB hard drives and I was thinking about using RAID 10. This would give me 4TB, correct? My next question is whether it would be easy to add more hard drives to the RAID array. For example, if I bought another hard drive, could I add it to the array without backing up any data? Basically I want to be able to start off with 4TB and, when the space becomes full, add more space as needed. If this isn't possible with RAID 10, is it possible with any RAID configuration? Any suggestions would be appreciated. Thank you.

    Read the article

  • Process that needs a volume starts before the volume mounts

    - by user36126
    The destination for incoming CrashPlan backups on my server (11.04) is /media/SeagateBig (SeagateBig is the volume name of my 2TB USB drive). When the server boots, two things happen: 1) SeagateBig auto-mounts and 2) CrashPlan starts. The problem is that these two things often don't happen in that order. Then I get: CrashPlan starts, looks for /media/SeagateBig, doesn't find it, and instead of waiting for it, CREATES IT. Now it's backing up onto my / filesystem. NOT COOL. Meanwhile, when SeagateBig finally gets around to mounting, it finds that /media/SeagateBig already exists, shrugs, and creates /media/SeagateBig_ as its mount point. What I need is a way for the order to be enforced: SeagateBig mounts, and then and only then is the CrashPlan service started. Unless I learn that CrashPlan can be told to wait for its destination directory and never create it... which I am also investigating. But the CrashPlanEngine script is installed by the product, so I am loath to modify it, even though I know I could by having it loop until df greps successfully for "SeagateBig".
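
    A minimal sketch of the "wait for the mount, then start the service" idea without touching the vendor's CrashPlanEngine script (the service name and the 5-minute timeout are assumptions): disable CrashPlan from starting at boot and launch it from a small wrapper instead.

        #!/bin/sh
        # Wait up to 5 minutes for the USB volume to be mounted,
        # then (and only then) start the CrashPlan engine.
        MOUNTPOINT=/media/SeagateBig
        for i in $(seq 1 60); do
            if mountpoint -q "$MOUNTPOINT"; then
                service crashplan start
                exit 0
            fi
            sleep 5
        done
        echo "$MOUNTPOINT never mounted; not starting CrashPlan" >&2
        exit 1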

    Read the article

  • Is there a way to schedule an automatic WinClone run on my Bootcamp partition?

    - by user17873
    Hi, I've currently got Time Machine set up to back up my entire OS X installation. I also have a backup tool within my Boot Camp Windows 7 installation which automatically backs up my Windows profile data to an external drive partition. Finally, I'm also backing up my Boot Camp partition weekly, storing it on an external drive using WinClone. The final piece I need to complete my external backup process is to have WinClone back up my Boot Camp partition automatically once a week, rather than having to remember to run it manually. Is this possible?

    Read the article

  • Some strange things in the db table of the mysql database

    - by 0al0
    I noticed some weird things in the db table of the mysql database on a client's server, after the MySQL service stopped for no reason. What are the test and test_% entries? Why are there two entries for the database AQUA? Why is there an entry with a blank name? Should I worry about any of these? What should I do in each specific case? Is it safe to just delete the ones that should not be there, after backing up?
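
    One hedged note: the test and test_% rows are normally the default "anyone may use a database named test" grants that ship with MySQL, and the stock cleanup script can remove them along with the test database itself. A minimal sketch, after taking a dump first:

        # back everything up, then interactively remove the test database,
        # anonymous users and their test/test_% grants
        mysqldump -u root -p --all-databases > all-databases-backup.sql
        mysql_secure_installation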

    Read the article

  • HAProxy overload protection

    - by user2050516
    Using HAProxy, would it be possible to configure overload protection that limits the number of requests sent to the backing HTTP server(s) to a given rate (e.g. 100 requests per second)? If the threshold is exceeded, requests should be answered with a default response. I am interested in requests per second, not connections per second, as a connection can have many requests. And yes, improving the servers is not an option here. If so, a configuration example to achieve that would be excellent. Thank you in advance.
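
    A minimal configuration sketch (assuming a reasonably recent HAProxy, 1.5 or later; the names, addresses and the 100 req/s figure are placeholders): deny requests with a canned error page once the frontend's HTTP request rate, rather than its connection rate, crosses the threshold.

        frontend ft_app
            bind :80
            # fe_req_rate = current HTTP requests per second on this frontend
            http-request deny if { fe_req_rate gt 100 }
            # serve a custom "try again later" page instead of the stock 403 body
            errorfile 403 /etc/haproxy/errors/overload.http
            default_backend bk_app

        backend bk_app
            server app1 10.0.0.10:8080 check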

    Read the article

  • What does the [0/0] indicator mean when entering copy mode in tmux?

    - by bps
    When entering copy mode in tmux, an indicator in the upper right corner shows "[0/0]". I can't find any documentation in the man page about what these numbers mean, and it's difficult to search since Google throws away the brackets and slash. This is generated by window_copy_write_line() in window-copy.c:

        if (py == 0) {
                size = xsnprintf(hdr, sizeof hdr,
                    "[%u/%u]", data->oy, screen_hsize(data->backing));
                if (size > screen_size_x(s))
                        size = screen_size_x(s);
                screen_write_cursormove(ctx, screen_size_x(s) - size, 0);
                screen_write_puts(ctx, &gc, "%s", hdr);

    but the variable names aren't very instructive to someone who isn't familiar with the code. Any hints as to what these numbers mean?

    Read the article

  • Backing up a folder on a sometimes-attached external USB hard drive

    - by ctrler
    My girlfriend no longer has space on her laptop's drive to store her photos. The drive she has now is 750GB, and going to a bigger drive would be expensive, as there aren't many 1.5TB 2.5-inch 9.5mm HDDs on the market (as of now, there is only one). Because of that, I am thinking of moving her pictures to a cheap external USB HDD. As of now, I'm automatically backing up her important folders (My Documents, Pictures, etc.) to a network drive using Windows 7's default backup software. My problem is that I don't know of a good solution for automatically backing up a folder residing on a USB disk. The USB disk won't be attached to the computer all the time, so I can't just treat it as a normal backup folder; sometimes the backup would run and the folder would not be there! Does anyone know of any software or methodology for backing up folders on external USB hard drives that are not always present? Thanks

    Read the article

  • rsync generates too much traffic

    - by user109459
    I use rsync for backing up one of my servers, which holds about 4GB of files. When I transfer these files, the traffic isn't the expected 4GB; it is a lot higher, about 60GB. I checked the traffic on my server, the backup server and the router, and all three report roughly 60GB. But at the end rsync says that it only transferred 4GB. Another problem is that I can't easily debug it because the problem occurs randomly.
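
    A hedged debugging sketch (the paths, host and log file are placeholders): running the same job with --stats and --itemize-changes shows how many bytes rsync actually sent on the wire and which files were re-transferred in full, which helps separate interrupted-and-restarted runs from genuine data churn.

        rsync -az --partial --stats --itemize-changes \
            /srv/data/ backup@backup.example.com:/backups/server1/ \
            | tee /var/log/rsync-backup-$(date +%F).log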

    Read the article

  • How to back up old emails locally in Thunderbird and then remove them from the IMAP server?

    - by saicode
    I am using GoDaddy IMAP email with Thunderbird as my desktop client on Windows 7. The email service has an unlimited mailbox size, but the local Thunderbird is having trouble due to the large size of the inbox/outbox. I would like to take old emails off the IMAP server and back them up locally. After backing up the old email, I would like to delete everything older than, let's say, 2012 from the server, while keeping it accessible from the local backup if it's ever needed in the future. This way I might be able to make Thunderbird fast and problem-free. The problem is, I am not able to find any instructions for doing this in an automated way based on dates, etc. I can find some links about archiving, compacting and backups, but I'm unable to find any tutorial on how to back up and archive email locally and then delete the original messages from the server.

    Read the article

  • Why hasn't anyone made a way for Time Machine to wirelessly back up to Amazon S3?

    - by Jordan
    Seriously. I'm looking at you, Apple. If Time Machine is supposed to be 'simple backup that just works', why is it impossible to back up into my S3 filespace? Why hasn't some 3rd-party developer (JungleDrive???) made it so that Time Machine will be OK with backing up to Amazon S3 storage? It just seems like the most convenient, robust answer. I'd gladly pay the $20-25 a month for complete, unlosable backups that I can sync with wirelessly on a proper scheme.

    Read the article

  • Do you have a data roadmap?

    - by BuckWoody
    I often visit companies where I'm asked, "What is SQL Server's roadmap?" What they mean is that they want to know where Microsoft is going with our database products. I explain that we're expanding not only the capacities in SQL Server but the capabilities: we're trying to make an "information platform", rather than just a data store. But it's interesting when I ask the same question back: "What is your data roadmap?" Most folks are surprised by the question, thinking only about storage and archival. To them, data is data. Ah, not so. Your data is one of the most valuable assets, if not the most valuable asset, in your organization. And you should be thinking about how you'll acquire it, how it will be distributed, how you'll archive it (which includes more than just backing it up) and, most importantly, how you'll leverage it. Because it's only when data becomes information that it is truly useful. To be sure, the folks on the web that collect lots of data have a strategy for it; do you?

    Read the article

  • Kill all currently running cron jobs

    - by Adelphia
    For some reason my cron job scripts aren't exiting cleanly, and they're backing up on my server; there are currently a couple hundred processes running for one of my users. I can use the following command to kill all processes owned by that user, but how can I narrow it down to kill only the cron jobs?

        pgrep -U username | while read id ; do kill -6 $id ; done

    It would be dangerous to run the above command as is, correct? Wouldn't that kill mysql and other important things?
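
    A hedged sketch of one way to narrow it down (the script name is a placeholder): match only the command line of the runaway cron script with pgrep/pkill -f, which leaves mysql and everything else that user owns alone.

        # preview which PIDs would be hit
        pgrep -U username -f 'nightly-job.sh'
        # then kill only those processes
        pkill -U username -f 'nightly-job.sh'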

    Read the article

  • Access to certain files but not others

    - by ADW
    Hoping someone can help me, as I have thus far been unable to solve the issue. I am running a media center utilizing Ubuntu 12.04. I was initially able to access the media files on the Ubuntu desktop from my Windows 7 laptop and my Roku device. I then started backing up a new batch of DVDs I had (into MKV files, like everything else in my media folders) and noticed I cannot access the new files from either the Roku or the laptop. I have not changed any settings on the media folder, and I have verified the share permissions. The parent folder (Media) is shared (with permission flow-down) while the subfolders (Movies, TV Shows, Music) are not. When the access problem arose I changed the permissions on these to be shared as well, but with no success. I can only access the originally uploaded files, not the newly added ones. Any suggestions??? Thanks in advance for any and all help.
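
    A hedged first check (the library path is a placeholder): files created by a different ripping tool or user account often land without read permission for other users, which makes them unreadable over the share even though sharing itself is configured correctly. Comparing the old and new files' owners and modes, and opening up read access if needed, is a cheap test:

        ls -l /home/media/Media/Movies          # compare owner/permissions of old vs. new files
        sudo chmod -R o+rX /home/media/Media    # grant read (and directory-traverse) to everyone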

    Read the article

  • Why is my backup to a USB drive so slow?

    - by Jonas
    I have tried several backup solutions for my data and none of them was good enough. I basically want to make a copy of my files to an attached USB drive from time to time. I don't mind starting my backups manually, since the USB drive is not always connected. My problem is that my data contains a huge number of files, so backing up takes forever (more than 20 hours). Using rsync and other similar solutions doesn't work, because the I/O needed to check each file for changes takes longer than the time to actually copy it. Any suggestions?

    Read the article

  • protobuf-net: incorrect wire-type exception deserializing Guid properties

    - by Paul Smith
    I'm having issues deserializing certain Guid properties of ORM-generated entities using protobuf-net. Here's a simplified example of the code (it reproduces most elements of the scenario, but doesn't reproduce the behavior; I can't expose our internal entities, so I'm looking for clues to account for the exception). Say I have a class, Account, with a read-only Guid AccountID and a read-write string AccountName. I serialize and immediately deserialize a clone. Deserializing throws an "Incorrect wire-type deserializing Guid" exception. Here's example usage:

        Account acct = new Account() { AccountName = "Bob's Checking" };
        Debug.WriteLine(acct.AccountID.ToString());
        using (MemoryStream ms = new MemoryStream())
        {
            ProtoBuf.Serializer.Serialize<Account>(ms, acct);
            Debug.WriteLine(Encoding.UTF8.GetString(ms.GetBuffer()));
            ms.Position = 0;
            Account clone = ProtoBuf.Serializer.Deserialize<Account>(ms);
            Debug.WriteLine(clone.AccountID.ToString());
        }

    And here's an example ORM'd class (simplified, but it demonstrates the relevant semantics as far as I can think of). It uses a shell game to deserialize read-only properties by exposing the backing field ("can't write" essentially becomes "shouldn't write," but we can scan code for instances of assigning to these fields, so the hack works for our purposes). Again, this does not reproduce the exception behavior; I'm looking for clues as to what could:

        [DataContract()]
        [Serializable()]
        public partial class Account
        {
            public Account()
            {
                _accountID = Guid.NewGuid();
            }

            [XmlAttribute("AccountID")]
            [DataMember(Name = "AccountID", Order = 1)]
            public Guid _accountID;

            /// <summary>
            /// A read-only property; XML, JSON and DataContract serializers all seem
            /// to correctly recognize the public backing field when deserializing:
            /// </summary>
            [IgnoreDataMember]
            [XmlIgnore]
            public Guid AccountID
            {
                get { return this._accountID; }
            }

            [IgnoreDataMember]
            protected string _accountName;

            [DataMember(Name = "AccountName", Order = 2)]
            [XmlAttribute]
            public string AccountName
            {
                get { return this._accountName; }
                set { this._accountName = value; }
            }
        }

    XML, JSON and DataContract serializers all seem to serialize and deserialize these object graphs just fine, so the attribute arrangement basically works. I've tried protobuf-net with lists vs. single instances, different prefix styles, etc., but I still always get the "incorrect wire-type ... Guid" exception when deserializing. So the specific question is: is there any known explanation or workaround for this? I'm at a loss trying to trace what circumstances (in the real code but not the example) could be causing it. We hope not to have to create a protobuf dependency directly in the entity layer; if it comes to that, we'll probably create proxy DTO entities with all public properties carrying protobuf attributes. (This is a subjective issue I have with all declarative serialization models; it's a ubiquitous pattern and I understand why it arose, but IMO, if we can put a man on the moon, then "normal" should be to have objects and serialization contracts decoupled. ;-) ) Thanks!

    Read the article
