Search Results

Search found 11409 results on 457 pages for 'large teams'.

  • Is there a way to undo deletion of registry keys while the machine is still running?

    - by Oliver Giesen
    [also posted from a programmer's POV at http://stackoverflow.com/questions/3299230]

    I messed up big time and deleted a large portion of my registry during a programming experiment: as a result, most of the contents of HKEY_CURRENT_USER\Software\ are gone. I haven't logged off or shut down since this happened. The applications that were already running seem to be coping fine so far, but I suspect that after the next reboot there won't be much happiness left...

    Also, System Restore tells me there are no restore points, even though I'm pretty sure there should have been. Could this be another symptom of the purged registry? I wouldn't have expected this information to be stored under HKCU, though... Does anybody know of a technique or utility that can possibly restore some or all of the deleted entries? I'm on Windows 7 Enterprise 32-bit. I'm not really holding my breath, but you can always hope, can't you?
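    One long-shot avenue, sketched here under the assumption that a Volume Shadow Copy from before the deletion still exists (System Restore reporting no restore points suggests it may not): mount the most recent shadow copy read-only and pull the old NTUSER.DAT hive out of it, then merge the lost Software branch back. The paths and hive names below are illustrative; run from an elevated prompt, and note the trailing backslash on the mklink target is required.

        vssadmin list shadows
        mklink /D C:\shadow \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy1\
        copy C:\shadow\Users\<you>\NTUSER.DAT C:\recovered\NTUSER.DAT
        reg load HKU\Recovered C:\recovered\NTUSER.DAT
        reg export HKU\Recovered\Software C:\recovered\software.reg
        reg unload HKU\Recovered

    Importing software.reg then restores the exported keys under the live HKCU\Software.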

  • Is it possible to have a cell in table1 "point to" a cell in table2?

    - by Lewray
    I have a hierarchical structure in a database-driven software application. Each row in parentTable 'owns' a number of rows in childTable. If a childTable row does not have a value set in columnA, then it should return the value specified in columnB of the appropriate parentTable row. Is it possible to implement a pointer or cell reference somehow, so that I do not have to copy values from parent to child? (A change in the parent could otherwise result in a large number of changes in the children.) If this is not possible, could anyone suggest a different approach?
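    A view can express exactly this "fall back to the parent" rule without copying anything; COALESCE returns its first non-NULL argument. The join column below (parent_id) is an assumption about the schema:

        -- assumes childTable.parent_id references parentTable.id
        CREATE VIEW childEffective AS
        SELECT c.id,
               COALESCE(c.columnA, p.columnB) AS effectiveValue
        FROM childTable c
        JOIN parentTable p ON p.id = c.parent_id;

    Queries read effectiveValue from the view, so a change in the parent row is picked up by every child at once.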

  • What's faster, cp -R or unpacking tar.gz files?

    - by Buttle Butkus
    I have some tar.gz files that total many gigabytes on a CentOS system. Most of the tar.gz files are actually pretty small, but the ones with images are large. One is 7.7G, another is about 4G, and a couple around 1G. I have unpacked the files once already and now I want a second copy of all those files. I assumed that copying the unpacked files would be faster than re-unpacking them. But I started running cp -R about 10 minutes ago, and so far less than 500M has been copied. I feel certain that the unpacking process was faster. Am I right? And if so, why? It doesn't seem to make sense that unpacking would be faster than simply duplicating existing structures.
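    It can be settled empirically by timing both operations on the same data (paths below are placeholders). There is also a plausible mechanism: extraction reads only the compressed bytes from disk, while cp -R reads every uncompressed byte back off the same spindle it is writing to, so for compressible data tar can win even though it burns more CPU.

        time cp -R images images_copy
        rm -rf images_copy
        time tar -xzf images.tar.gz -C /path/to/second_copy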

  • Can I recover files from a disk whose first 5% has been completely wiped (overwritten with 1s)?

    - by ARA
    Recently a virus attacked my PC and wiped the first 5% of my hard disk, which holds a single partition. Using a hex viewer program (Active@ UNDELETE), I cleared out the virus's data and overwrote it with 1s. I want to recover a large file of about 10 GB, but no recovery tool seems to be able to recover any files. I want to know: in theory, is this file recoverable? I think the file is fragmented. I have researched the NTFS file system, and I understand that cluster information is saved only in the MFT. Is there any way to recover a file without the MFT structure?
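    With the MFT gone, one remaining option is file carving, which ignores filesystem metadata entirely and scans the raw disk for file signatures; PhotoRec is a common free tool for this. The caveat is severe, though: carving cannot reassemble a fragmented file, so a 10 GB file recovered this way is only likely to be intact if it happened to be stored contiguously. The device name below is a placeholder:

        sudo photorec /dev/sdb    # interactive: pick the partition, filesystem type, and file formats to carve for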

  • WMP12 refuses to convert files when syncing

    - by Carbonara
    I have quite a large music collection of MP3s at 320kbps and some WMA files at various bitrates. I'm trying to sync some of them to my HTC Desire and am quickly running out of space. WMP12 has a per-device option to automatically convert files to a lower bitrate while syncing, and I have set it to convert to a maximum bitrate of 192kbps; that way I can fit more music on the device but keep the files on my PC at the higher rate. As far as I can tell it's all set up correctly. The only problem is, surprise surprise for a Microsoft product, it doesn't actually work. Any file greater than 192kbps, MP3 or WMA, simply fails: it doesn't get converted or copied to the device, and the sync log displays the rather unhelpful message "error" and nothing else. Any help would be appreciated. I'm not really looking for alternative software solutions; I'd like to get this working, since that's what it's supposed to do.

  • JBoss 4.2.3 Won't Start

    - by Thody
    Hi, I'm trying to start a new installation of JBoss 4.2.3. It gets as far as "INFO [Server] Core system initialized", then hangs for several minutes. There is a Java process running, but only at ~35% CPU. Also, looking at boot.log, there are no entries past roughly one second after the boot started. Any ideas what might be up? Update: after about 10 minutes, I got a handful of garbage collection warnings: "GC Warning: Repeated allocation of very large block (appr. size 512000): May lead to memory leak and poor performance."
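    One cause worth ruling out (an assumption, not a confirmed diagnosis for this log): on Linux, the JVM can block for minutes while seeding SecureRandom from /dev/random when the entropy pool is empty, which looks exactly like a silent hang early in boot. Pointing it at the non-blocking device in JBoss's bin/run.conf is the usual test:

        # bin/run.conf (JBoss 4.2.x)
        JAVA_OPTS="$JAVA_OPTS -Djava.security.egd=file:/dev/./urandom"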

  • How can I tell if a host is bridged and acting as a router

    - by makerofthings7
    I would like to scan my DMZ for hosts that are bridged between subnets and have routing enabled. Since I have everything from VMware servers to load balancers on the DMZ, I'm unsure whether every host is configured correctly. What IP, ICMP, or SNMP (etc.) tricks can I use to poll the hosts and determine whether a host is acting as a router? I'm assuming this test presumes I know the target IP, but in a large network with many subnets I'd have to test many different combinations of networks and see if I get success. Here is one example (ping):

        1. For each IP in the DMZ, ARP for the host's MAC.
        2. Send an ICMP echo request to that host, directed at an online host on each subnet.

    I think there is a more optimal way to get this information, namely from within ICMP/IP itself, but I'm not sure what low-level bits to look for. I would also be interested to know whether it's possible to determine "router" status without knowing the subnets the host may be connected to. This would be useful to know when improving our security posture.
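    Two concrete probes, with placeholder addresses: ask the host directly over SNMP whether IP forwarding is enabled (the standard object is IP-MIB::ipForwarding.0, where 1 = forwarding, 2 = not forwarding), or force a packet through the candidate and see whether it forwards. The route test assumes the candidate is on your local subnet, and it does not require knowing the candidate's other networks, only a reachable target behind it:

        snmpget -v2c -c public 10.0.0.5 IP-MIB::ipForwarding.0

        ip route add 192.168.50.10/32 via 10.0.0.5
        ping -c 3 192.168.50.10        # replies mean 10.0.0.5 forwarded for you
        ip route del 192.168.50.10/32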

  • Limiting memory usage and minimizing swap thrashing on Unix / Linux

    - by camelccc
    I have a few machines that I use for running large numbers of jobs, where I try to limit the number of jobs so as not to exceed the available RAM of the machine. Occasionally I mis-estimate how much memory some of the jobs will take, and the machine starts thrashing the swap file. I resolve this by sending kill -s STOP to one of the jobs so that it can get swapped out. Does anyone know of a utility that will monitor a server for processes with a specific name and then pause the one with the smallest memory footprint if the total memory consumption reaches a desired threshold, so that the larger ones can run and complete with a minimum of swap-file thrashing? Paused processes then need to be resumed once some existing processes have completed.
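    A minimal watchdog along those lines, assuming GNU ps and that the jobs share a process name; the name and threshold below are placeholders. It pauses the smallest instance whenever the group's total resident memory crosses the limit (resume candidates later with kill -CONT):

        #!/bin/sh
        NAME=bigjob            # process name to watch (placeholder)
        LIMIT_KB=30000000      # total RSS threshold in KB (placeholder, ~30 GB)
        while true; do
            total=$(ps -C "$NAME" -o rss= | awk '{s+=$1} END {print s+0}')
            if [ "$total" -gt "$LIMIT_KB" ]; then
                # --sort=rss lists ascending, so the first PID is the smallest job
                victim=$(ps -C "$NAME" -o pid= --sort=rss | head -n 1)
                [ -n "$victim" ] && kill -STOP "$victim"
            fi
            sleep 30
        done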

  • Updating shared files across computers

    - by murgatroid99
    I have a file server running Windows Server 2008 and a couple of laptops running Windows 7 on a network. There are a large number of files that all users will need access to. My plan is to have the files on both the server and the laptops because the users will need to access the files in places with no Internet access. I also want any changes made to the files on any of the laptops to propagate to the server and then propagate to the other laptops whenever they connect to the network. Should I do this with a scheduled batch script with a few xcopy commands or is there a better way to do it?
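    For the server-to-laptop leg, robocopy (built into Windows 7 and Server 2008) is more robust than xcopy for this: /MIR mirrors a tree and /Z makes copies restartable after a dropped connection. Two-way sync is the hard part, though; a naive mirror in both directions will happily propagate deletions, so for true multi-master replication the DFS Replication role on Server 2008 is worth a look. Paths below are placeholders:

        robocopy \\server\share C:\LocalShare /MIR /Z /R:2 /W:5 /LOG+:C:\sync.log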

  • How to apply a folder template to more than one selected folder in Vista?

    - by Albic
    I would like to apply a folder template to a number of folders. I selected the folders, opened the properties, selected the folder template (e.g. Music Details) and clicked OK. When I checked the folders, I noticed that the template had only been applied to the folder I right-clicked to open the properties on. The other folders remained untouched. I can't apply the template to the parent folder and use the "Also apply this template to all subfolders" option, because the template should only be applied to specific folders, not all of them. Going over each folder individually is not an option because there is a large number of folders. Is it possible to apply a folder template to more than one selected folder at a time?
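    One scripted approach, offered as a sketch rather than a verified fix: Vista records the template in a desktop.ini inside each folder, so a batch loop can stamp it onto exactly the folders you choose. The exact FolderType string is worth verifying by applying the template to one folder manually and reading its desktop.ini (values like Music or MusicDetails are reported for Vista), and the folder needs the system attribute for desktop.ini to be honored:

        for /d %%D in ("D:\Albums\*") do (
            (echo [ViewState]& echo Mode=& echo Vid=& echo FolderType=Music) > "%%D\desktop.ini"
            attrib +s +h "%%D\desktop.ini"
            attrib +s "%%D"
        )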

  • Should I bother upgrading my Opteron 270 Server?

    - by MousePad
    I have an Opteron server machine (in a large workstation-class case) running on the Tyan 2895 motherboard. It's a dual-CPU-socket board, but I only have one 270 in there. I have 4GB of RAM, but less than 3GB is addressable, even in 64-bit mode, due to the way the board is designed. Is it worth spending a few hundred on an additional CPU and maybe some more RAM? The other problem is that one of the two SATA ports on the board had its wire socket break off, so only one drive can be run as of now. I could have it repaired, but at what cost? Add in the fact that the power supply is gunked up with dust and it's a bit of a nightmare. I actually worry about it getting too hot. It seems that for the money I could buy a new rack server from Dell, but it also seems a shame to waste an otherwise working, and for my needs still very fast, machine.

  • Hosting solution for sensitive client data

    - by Mark
    Hello, we are developing a web application that will deal with highly sensitive (financial) data of clients (the audience is medium to large sized businesses). Clients will be under scrutiny from regulators and auditors, and as such, so will we. More importantly, to give clients a level of comfort, our application and the related hosting arrangement should instill a lot of confidence in them. We are looking into using a cloud-based service like Linode, Amazon EC2, etc. To allow for maximum flexibility, we are keen on putting everything on virtual servers and avoiding having to buy our own hardware. Does a cloud-based service make sense for our particular scenario? If not, what type of hosting should we consider? If so, what should we look out for? Thanks!

  • iTunes high CPU usage

    - by Calm Storm
    I upgraded to iTunes 10.4.1 on Windows 7; my iTunes library is not that large at all (about 20 GB). When I start iTunes, the CPU goes to between 60-80% and stays there for a long time. I see that iTunes.exe takes about 70% of CPU in Process Explorer, and it spawns a SearchProtocolHost.exe every 2 minutes or so which takes < 0.1% CPU. Other than that, iTunes.exe is always at 70-90% and never lets me do anything else. Does someone have a suggestion? EDIT: I have tried reinstalling 10.4.1, completely deleting my library and starting with a plain installation, and that does not work. I have tried downgrading to 10.3.x, and that does not work either :(

  • Folder sync application which can sync over the Internet (the other machine specified by an IP)?

    - by Adal
    I need to sync some folders between two Win 7 machines. While they are connected to the same LAN, they can't see each other over Windows networking, since sharing is disabled on both of them (security reasons). Do you know any sync app which can work over IP? The folder I need to sync has 500,000 files in it (80 GB in total), so the sync app should be pretty efficient. At the moment I copy the files from one machine to the other over FTP, but it takes forever, since a separate connection is opened for each file. Or maybe you know of some app which can efficiently transfer a large number of files between two machines over the Internet?
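    This is the classic rsync use case: its delta algorithm transfers only changed files (and changed portions of files) over a single SSH connection, instead of one FTP connection per file. On Windows 7 it needs a port such as cwRsync or DeltaCopy; the paths and address below are placeholders:

        rsync -az --delete --partial /data/share/ user@192.168.1.25:/data/share/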

  • Netgear GS108TV2 RSTP configuration

    - by jhowland
    I have a large set of GS108TV2 units. My goal is to set up a network composed of several loops for redundancy/fault tolerance, and I have a minimal three-switch loop configured, with RSTP enabled on two ports of each switch. My bridge max age is set to 6 and my bridge forward delay to 4, which are the minimum values allowed; hello time is fixed at 2 seconds. The switches respond to a cable being removed from a socket, but it takes too long: I cannot get a switch to respond to a loss of connection on one of the redundant ports in less than 20 seconds. Is there any way to configure these switches to respond faster than 20 seconds? That is unacceptable for my application. Thanks in advance for any help.

  • Multiple File Upload via FTP Using CMD

    - by user697363
    I have a large number of files, over 10,000, which I want to upload to an FTP server. I can't zip those files and upload them as an archive, as I have to read the files individually in SAS software for my analysis. If I use the mput command, the prompt asks me to type "y" each time it tries to upload a file, which is very cumbersome. Is there any method to upload the files automatically, without having to manually enter "y" for each one? The commands I was using were:

        ftp ftp.myftp.com
        (logging in with myusername / mypassword)
        ftp> lcd c:\local_folder
        ftp> mput *.*
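    The per-file question is ftp's "interactive mode", and it can be switched off: either start the client with the -i switch, or toggle it inside the session with the prompt command before running mput.

        ftp -i ftp.myftp.com

    or, within an existing session:

        ftp> prompt
        ftp> mput *.*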

  • How to secure a VM while allowing customer RDS (or equivalent) access to its desktop

    - by ChrisA
    We have a Windows client/(SQL-)server application which is normally installed at the customer's premises. We now need to provide a hosted solution, and browser-based isn't feasible in the short term. We're considering hosting the database ourselves, and also hosting the client in a VM. We can set all this up easily enough, so we need to ensure that the customer can connect easily, and also ensure that we suitably restrict access to the VM (and its host, of course). We already access the host and guest machines across the Internet via RDS, but we restrict access to only our own internal, very small set of static IPs, and of course there's the 2 (or 3?) user limit on RDS connections to a remote server. So I'd greatly appreciate ideas on how to manage the security and the multi-user aspect. We're hoping to be able to do this initially without a large investment in virtualisation infrastructure; it would be one customer only to start with, with perhaps two remote users. Thanks!
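    For the access-restriction half, the Server 2008 firewall can scope RDP to the customer's static range from the command line (the address range below is a placeholder). The multi-user half is harder: the built-in allowance covers only two concurrent administrative sessions, so serving customers properly generally means installing the Terminal Services / Remote Desktop Services role with CALs, ideally published through a TS/RD Gateway rather than exposing port 3389 directly.

        netsh advfirewall firewall add rule name="RDP customer" dir=in action=allow protocol=TCP localport=3389 remoteip=203.0.113.0/24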

  • Hooking up many different external HDs simultaneously

    - by cbizz
    I need a large amount of external storage for an upcoming project. I'm planning on purchasing 10 2TB external drives. I need them all hooked up to a single machine at the same time. What issues will I run into? I plan on using 2 power strips and having them all externally powered from the wall. I will use a USB hub to plug in all the drives. I need drive access time to be as fast as possible. I am using Ubuntu Linux(64 bit). Will I be able to mount 10 drives?
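    Mounting ten drives is routine for Linux; the practical trick is to mount by UUID so the mount points survive USB enumeration order changing between boots. A sketch, with placeholder UUIDs and mount points:

        sudo blkid                      # note each drive's UUID
        sudo mkdir -p /mnt/ext{1..10}
        # then one /etc/fstab line per drive, e.g.:
        # UUID=0a1b2c3d-...  /mnt/ext1  ext4  defaults,noatime  0 2

    The bigger caveat is bandwidth, not mount count: every drive behind one USB 2.0 hub shares a single ~480 Mbit/s link, so access will be far slower than eSATA or spreading the drives across multiple root ports.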

  • How to protect a PHP app (vBulletin) from hackers

    - by samsmith
    Our vBulletin system is under constant attack, raising CPU load and making the system very slow for legitimate users. The attack is a script-type attack that is attempting to log in and/or create new login IDs (mostly it is trying to create login IDs in order to spam the site). In vBulletin we have blacklisted large ranges of IPs, which has helped a lot, but the attacks continue. Is there an automated way to protect the application or web server? Ideally, the protection would detect the pages accessed and automatically blacklist the IP.
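    One common automated approach on Linux is fail2ban, which tails the web server log and firewalls IPs that hit chosen URLs too often. The filter below is a hypothetical sketch; the log path, URL names, and thresholds all need adapting to the actual setup:

        # /etc/fail2ban/filter.d/vbulletin.conf
        [Definition]
        failregex = ^<HOST> .* "POST /(login|register)\.php

        # /etc/fail2ban/jail.local
        [vbulletin]
        enabled  = true
        filter   = vbulletin
        logpath  = /var/log/apache2/access.log
        maxretry = 10
        findtime = 60
        bantime  = 3600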

  • How to reduce the pain of the command prompt

    - by Adam
    I want to learn to use the command prompt on Windows better, to have more control over what I do and just for the learning experience. The main annoyance I have right now is all of the typing. If I want to perform an operation on a file with a long path, I'm sitting there typing it out for a minute at least, and if I make a mistake I have to press the up-arrow key, scroll through the entire thing, and find what I did wrong. Are there any tools to make this easier?
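    A few built-ins take most of the typing pain away: Tab completes file and folder names as you type a path, F7 pops up a pick-list of command history, dragging a file onto the window pastes its full path, and doskey defines aliases. The macros below are just examples:

        doskey ..=cd ..
        doskey np=notepad $*
        pushd C:\Some\Deep\Folder
        rem ... work here, then popd jumps straight back
        popd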

  • In-House Dropbox

    - by beardedlinuxgeek
    Dropbox is perfect, but as a company we can't host anything worthwhile on servers that we don't control. So I've been tasked with coming up with a Dropbox alternative, something in-house. GlusterFS is nice, but there's no offline access. SparkleShare uses Git, which isn't great for large files, and it also doesn't have a Windows port. Any other options? If I were to roll out my own from scratch, what do you think the best way to go about it would be?
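    For a roll-your-own starting point, the usual Linux building blocks are inotify for change detection plus rsync for efficient transfer. The loop below is a deliberately naive sketch (paths and host are placeholders, and there is no conflict handling or offline queueing): it pushes the whole tree whenever anything changes.

        inotifywait -m -r -e close_write,moved_to,delete /srv/share |
        while read -r event; do
            rsync -az --delete /srv/share/ user@mirror:/srv/share/
        done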

  • Increase Volume of an MKV Video from Linux Terminal

    - by The How-To Geek
    I've got a large number of .MKV video files which all seem to play at a very low volume; I end up having to turn the TV up all the way to hear them, which is really irritating when I switch to another channel and wake the dead because it's so loud. What I'm looking for is a command-line method to increase the volume (so I can run it on all of them quickly) that would hopefully work regardless of the audio codec in use in the particular file (I don't mind hard-coding the output audio, though). For reference, I'm using Ubuntu 9.04 on my server, and the files are being played back with Boxee on a Mac Mini, but the volume problem is the same on Windows too.
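    ffmpeg can do this while stream-copying the video, so only the audio is re-encoded (which matches the "hard-coded output audio" constraint). On older builds like Ubuntu 9.04's, the gain knob is -vol, where 256 is unity, so 512 doubles the volume; recent builds use the -af volume= filter instead. A sketch over a whole directory, with the output prefix as a placeholder:

        for f in *.mkv; do
            ffmpeg -i "$f" -vcodec copy -acodec libmp3lame -vol 512 "loud_$f"
        done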

  • Backup/Multihomed network connection

    - by J_P
    We have a couple of locations that require 24/7 Internet access, and our current provider (AT&T), while mostly good, is not always up. My concern is that if I go with another provider (for example Comcast), I'm going to be subject to the same downtime if the failure is in the "last mile". I mostly don't know where the failure points are on the ISP side, but I would imagine the large majority are within the last mile. I've looked at MiFi and similar solutions, but have concerns about bandwidth caps and overall speed. Any suggestions would be appreciated.

  • How to calculate the proper inode and block sizes for a Linux filesystem

    - by Donatello
    I have an old ReiserFS filesystem which I'm going to convert to ext3. The problem is determining the proper block and inode sizes for the partition. The partition is 44 GB and has to hold 3,000,000+ files of sizes between 1 KB and 10 KB. How can I figure out the best ratio of inodes to block size? Below is something I tried; it seems OK, but it makes copying files incredibly slow.

        mkfs.ext3 -t ext3 -c -c -b 1024 -i 4096 -I 128 -v -j -O sparse_super,filetype,has_journal /dev/sdb1

    Thanks.
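    Working through the numbers: with -i 4096, a 44 GB partition gets roughly 44e9 / 4096 ≈ 10.7 million inodes, far more than the 3 million needed, so the inode ratio is not the bottleneck. Two other things in that command are more likely to hurt: the doubled -c runs a slow read-write badblocks scan, and the -O list omits dir_index, the hashed-directory feature that keeps ext3 usable with millions of files. A leaner invocation to benchmark (a suggestion, not a guarantee) might be:

        mkfs.ext3 -b 2048 -i 8192 -I 128 -O dir_index,sparse_super,filetype,has_journal /dev/sdb1

    That still yields about 5.4 million inodes (44e9 / 8192), and 2 KB blocks waste little space on 1-10 KB files. dir_index can also be enabled on an existing filesystem with tune2fs -O dir_index followed by e2fsck -fD.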

  • Should we regularly schedule mysqlcheck (or database optimization)?

    - by scatteredbomb
    We run a forum with some 2 million posts, and I've noticed that if left untouched, the overhead in MySQL (as listed in phpMyAdmin) can get quite large (hundreds of megabytes). I'm wondering whether scheduling a regular mysqlcheck to optimize the tables is good practice? Any reason not to do it, say, once a week at an off-peak hour? There was a time over the summer when our site was constantly crashing because MySQL was using up all resources. That's when I noticed the huge amount of overhead; I optimized the database and haven't had any stability problems since. I figured that if that helped alleviate the issues, I should just set up a cron job to do it automatically.
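    Once a week off-peak is a common pattern; a cron entry along these lines (credentials, user, and paths are placeholders) runs the optimization early Sunday morning. One caveat worth knowing: OPTIMIZE TABLE locks each MyISAM table while rebuilding it, so posting will block briefly while each table is processed.

        # /etc/cron.d/mysql-optimize
        30 4 * * 0  root  mysqlcheck --optimize --all-databases --user=maint --password=changeme >> /var/log/mysqlcheck.log 2>&1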
