Search Results

Search found 9067 results on 363 pages for 'big fizzy'.


  • Massive number of deeply nested subfolders with long names. How can I delete all of them?

    - by Carlos
    Good day. We have a problem here. We have a share holding backups for all of our offices' servers; it's a really big share with more than 8,000,000 files. Our users tend to give long names to the folders they create, then make subfolders (long too), and more subfolders, and more subfolders... We have a new share with more capacity, and with a simple robocopy batch file we copied all the files and folders (some gave problems, but we copied those manually). Now the problem is deleting the old ones: the del command doesn't cope with such long paths, and neither does rmdir. I've tried some file managers, but no luck. Can you recommend a tool that can delete recursively and handle paths longer than 255 characters? Edit: the OS behind the share is NetApp's, but I can access it from Windows Server 2000 and 2003 machines. Thanks.
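
    A common workaround: the path-length limit lives in the Win32 layer, not in NTFS or the NetApp filer, so anything that uses extended-length paths can delete what del and rmdir cannot. Below is a minimal Python sketch using the \\?\ prefix (the drive letter is a placeholder for however you map the share); mirroring an empty folder over the target with robocopy /MIR is a well-known no-code alternative.

      import os

      def delete_long_path_tree(root):
          # The \\?\ prefix switches the Windows API to extended-length
          # paths (~32,767 chars) instead of the usual 260-char limit.
          # For a UNC path the prefix form is \\?\UNC\server\share\...
          root = "\\\\?\\" + os.path.abspath(root)
          # Walk bottom-up so each directory is empty before removal.
          for dirpath, dirnames, filenames in os.walk(root, topdown=False):
              for name in filenames:
                  os.remove(os.path.join(dirpath, name))
              for name in dirnames:
                  os.rmdir(os.path.join(dirpath, name))
          os.rmdir(root)

      delete_long_path_tree(r"Z:\old-backup-share")   # placeholder mapping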

    Read the article

  • SQL Server 2008, not enough disk space

    - by snorlaks
    Hello, I'm executing a SQL query against my database. I have SQL Server 2008 installed on my D: drive, which has 55 GB of free space; my C: drive has something like 150 MB free (right now). While executing a query against quite a big table (16 GB) I get an error: "An error occurred while executing batch. Error message is: Not enough disk space." I would like to know whether there is a way to make SQL Server use the D: drive instead of C:, or whether there is some other problem with what I'm doing. Thanks for the help.
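
    The usual culprit when the system drive fills up during a big query is tempdb, which lives on C: by default and absorbs sorts, hashes, and row versions. A hedged sketch of checking and moving it (assumes the default logical names tempdev/templog and a D:\sqltemp folder you have created; the change takes effect after a SQL Server service restart):

      import pyodbc

      # Connect to the instance; adjust driver/server names to yours.
      conn = pyodbc.connect(
          "DRIVER={SQL Server};SERVER=localhost;Trusted_Connection=yes",
          autocommit=True,
      )
      cur = conn.cursor()

      # Where do tempdb's files currently live?
      cur.execute("SELECT name, physical_name FROM sys.master_files "
                  "WHERE database_id = DB_ID('tempdb')")
      for name, path in cur.fetchall():
          print(name, path)

      # Point tempdb at the roomy D: drive (applies on the next restart).
      cur.execute("ALTER DATABASE tempdb MODIFY FILE "
                  "(NAME = tempdev, FILENAME = 'D:\\sqltemp\\tempdb.mdf')")
      cur.execute("ALTER DATABASE tempdb MODIFY FILE "
                  "(NAME = templog, FILENAME = 'D:\\sqltemp\\templog.ldf')")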

    Read the article

  • Looking for Full Screen Web-Browser in Vizio or Samsung TV's

    - by ScottCate
    I see that Google TV, inside Sony sets, has a Chrome browser. Is the same thing possible inside Vizio, Samsung, or any other TV? The 42" Vizio is $499 at Costco, and the Sony with Google TV is $999. We're looking at a bunch of these TVs with built-in Wi-Fi to place around the office as big dashboards, and I don't want to have a computer attached just to get web content on the screen. It's my understanding that both Samsung Apps and Vizio Apps use Yahoo Widgets as their app engine. Does anyone know of a way to get a full-screen browser going without attaching a computer? We thought of attaching an Apple TV, which can be jailbroken, but that is just another piece of equipment to lose, break, etc. Thank you!

    Read the article

  • Suggested benchmark for testing CPU footprint of antivirus software

    - by Alex Chernavsky
    Our organization is currently running Symantec Corporate Antivirus, which is rumored to be a big resource hog. I know that we do have a lot of older machines that run slowly. Our PCs all run Windows XP Pro and are used only for business applications (mostly Microsoft Office), e-mail, and web surfing. They're not used for gaming (one would hope not, anyway). I'd like to take one of the old PCs and run a speed benchmark while it's running Symantec AV, then another test with no antivirus, and a third test with ESET NOD32. As I said, I don't care much about graphics performance. What would be an appropriate benchmarking program to use? Freeware is best, of course. Thank you for considering my question.

    Read the article

  • Archive Outlook mail items into SQL Server

    - by marc_s
    I am looking for (and so far not finding) a solution to archive e-mail items from my Outlook into SQL Server. My PST is beginning to get really, really big, and I'd love to extract my older e-mail into SQL Server in a way that lets me still find mails easily when needed. I would prefer SQL Server as the storage medium since I'm familiar with it and it's rock solid; I don't want a collection of PST files or CHM files or anything like that. Does anyone know of such a solution? I'm a power/home user: I can't afford $5,000 enterprise licenses; I need a sub-$100 solution for private use.
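
    For a DIY version, Outlook's COM automation can be driven from Python. A sketch assuming pywin32 and pyodbc, a pre-created MailArchive database and table, and the default Inbox (COM date values may need converting, as done here, depending on the ODBC driver):

      import pyodbc
      import win32com.client

      # Assumed table:
      # CREATE TABLE Mail (Subject NVARCHAR(512), Sender NVARCHAR(256),
      #                    Received DATETIME, Body NVARCHAR(MAX))
      conn = pyodbc.connect("DRIVER={SQL Server};SERVER=localhost;"
                            "DATABASE=MailArchive;Trusted_Connection=yes")
      cur = conn.cursor()

      outlook = win32com.client.Dispatch("Outlook.Application")
      inbox = outlook.GetNamespace("MAPI").GetDefaultFolder(6)  # 6 = olFolderInbox

      for item in inbox.Items:
          if item.Class != 43:         # 43 = olMail; skip meeting items etc.
              continue
          cur.execute("INSERT INTO Mail (Subject, Sender, Received, Body) "
                      "VALUES (?, ?, ?, ?)",
                      item.Subject, item.SenderName,
                      str(item.ReceivedTime), item.Body)
      conn.commit()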

    Read the article

  • How do I adjust the actual font sizes in Internet Explorer?

    - by Cyberherbalist
    I like having as much stuff on the screen at once as I can. But as a consequence of getting on in years, small font sizes are starting to become a problem. Internet Explorer can increase the font size of text on web pages (those pages that don't set the font size explicitly): on the main menu, it is View | Text Size, and the choices are Smallest, Smaller, Medium, Larger, Largest. Medium is, on many web pages, just a tad too small for me, but if I go to Larger the font size is way too big. My question is: is there a way to set the size of these text-size jumps differently from what they seem to be?

    Read the article

  • Moving files from Public folder to C: takes a minute, even though they are on the same hard drive and same partition

    - by Jian Lin
    I have a big file, about 2 GB, and would like to move it from Network -> Bookroom -> Users -> Public (this is the computer in the bookroom of the house) to c:\myfiles. They are actually on the SAME hard drive (and same partition), but the move still takes a minute or so. I thought that on the same hard drive and partition it would be a true "move" and should take only 2 or 3 seconds. That public folder is also reachable as \\Bookroom\Users\Public. Update: sorry, I do actually mean "move" all the way, not copy; that's why I thought it should take only 2 or 3 seconds.
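
    A likely explanation: the UNC path goes through the SMB redirector, which Windows treats as a different volume from C:, so the "move" silently degrades to copy-plus-delete. A small illustration (file names are placeholders, assuming the share really is local to this machine as described):

      import os
      import shutil

      # Same volume, local namespace: a rename is metadata-only and is
      # near-instant regardless of file size.
      os.rename(r"C:\Users\Public\big.iso", r"C:\myfiles\big.iso")

      # Via the UNC path the SMB redirector sits in the middle; Windows
      # sees two different volumes, a true rename is refused, and the
      # "move" falls back to copying all 2 GB and deleting the original.
      shutil.move(r"\\Bookroom\Users\Public\big.iso", r"C:\myfiles\big.iso")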

    Read the article

  • How can I create multiple identical AWS EC2 server instances with large amounts of persistent data?

    - by mojones
    I have a CPU-intensive data-processing application that I want to run across many (~100,000) input files. The application needs a large (~20 GB) data file in order to run. What I would like to do is: (1) create an EC2 machine image that has my application and associated data files installed, (2) boot up a large number (e.g. 100) of instances of this image, and (3) split my input files into 100 batches and send one batch to be processed on each instance. I am having trouble figuring out the best way to ensure that each instance has access to the large data file. The data file is too big to fit on the root filesystem of an AMI. I could use Block Storage, but a given Block Storage volume can only be attached to a single instance, so I would need 100 clones. Is there some way to create a custom image that has more space on the root filesystem so that I can include my large data file? Or is there a better way to tackle this problem?
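
    One common pattern: snapshot the 20 GB data volume once, then stamp out one EBS volume per instance from that snapshot. Volumes created from a snapshot are independent copies, so the one-volume-one-instance restriction stops mattering. A sketch with boto3 (the region, snapshot ID, tag filter, and device name are placeholder assumptions):

      import boto3

      ec2 = boto3.resource("ec2", region_name="us-east-1")
      snapshot_id = "snap-0123456789abcdef0"   # hypothetical data-volume snapshot

      workers = ec2.instances.filter(
          Filters=[{"Name": "tag:role", "Values": ["worker"]}])

      for instance in workers:
          # Each worker gets its own private copy of the 20 GB data set.
          volume = ec2.create_volume(
              SnapshotId=snapshot_id,
              AvailabilityZone=instance.placement["AvailabilityZone"],
          )
          ec2.meta.client.get_waiter("volume_available").wait(
              VolumeIds=[volume.id])
          volume.attach_to_instance(InstanceId=instance.id, Device="/dev/sdf")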

    Read the article

  • Using Pidgin 2.5.2 on Linux, no sound is played for incoming messages (though it should be)

    - by Kent
    Hi, I have a problem with Pidgin 2.5.2 on Linux (Ubuntu 8.10). When someone sends me a message, no sound is played (the tray icon indicates a new message and blinks, but that's it). Sounds play fine when I send someone a message, and if I preview the "Message received" and "Message sent" sound events, both of them do make a sound. Of the alternatives in the sound-method selection, Automatic and ALSA are the ones that work. I'm including a screenshot containing a lot of relevant information: Screenshot (it's too big to fit nicely inline.)

    Read the article

  • Windows 2008 VPS always crashes when out of disk space

    - by Pickels
    Hello, I am renting a Windows Server 2008 DC SP2 VPS for hosting my ASP.NET projects. Now, for the second time this month, my VPS has run out of disk space. The first time it was a log file that got too big; yesterday it was my mistake of uploading a website without noticing the lack of space on the VPS. The side effect is that the VPS corrupts some files when trying to write them: last time it was Plesk that stopped working, yesterday it was IIS. So I was wondering: is this normal behavior? I called my service provider to ask if they could restore a backup and whether this is normal, and they assured me it was. I am not trying to blame them, and I know it's mostly my fault for not monitoring my VPS better and for not setting better defaults.
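
    Corruption aside, both incidents would have been caught early by a trivial scheduled free-space check. A sketch to run from Task Scheduler (the threshold, addresses, and SMTP host are placeholders):

      import shutil
      import smtplib
      from email.message import EmailMessage

      THRESHOLD_GB = 2                     # alert below this much free space
      free_gb = shutil.disk_usage("C:\\").free / 1024**3

      if free_gb < THRESHOLD_GB:
          msg = EmailMessage()
          msg["Subject"] = "VPS disk alert: %.1f GB free on C:" % free_gb
          msg["From"] = "vps@example.com"
          msg["To"] = "admin@example.com"
          msg.set_content("Free space is below the threshold; check log "
                          "growth and recent uploads before services start "
                          "corrupting files.")
          with smtplib.SMTP("smtp.example.com") as smtp:
              smtp.send_message(msg)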

    Read the article

  • SonicWall HA "gotchas"?

    - by Mark Henderson
    We're looking to move away from PFSense and CARP to a pair of SonicWall NSA 2400s[1] configured in Active/Passive for High Availability. I've never dealt with SonicWall before, so is there anything I should know that their sales guy won't tell me? I'm aware that they had an issue with a lot of their devices shutting down connectivity because of a licensing fault, and that they have an overly complex management GUI (on the older devices at least), but are there any other big "gotchas" that I need to be aware of before committing a not insubstantial amount of money to these devices? [1] If you're outside the US, the SonicWall global sites suck balls. Use the US site for all your product research, and then use your local site when you're after local information.

    Read the article

  • Windows 7 can't copy file - Error 0x800700DF: The file size exceeds the limit allowed and cannot be saved

    - by JJGroover
    Any attempt to copy files larger than about 40 MB from a network share (a SAN running Openfiler / Samba) to my local machine running Windows 7 always results in the following error, and the copy fails: Error 0x800700DF: The file size exceeds the limit allowed and cannot be saved. I've tried copying to my C: drive and to a USB drive with the same results. Smaller files copy just fine. Clearly 40 MB is not that big a file, so I'm assuming it is some buggy interaction between Windows 7 and Samba, perhaps. Google has so far turned up nothing. Can anyone point me in the right direction?
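
    Error 0x800700DF is the signature of the WebDAV redirector, not of SMB itself: if Windows 7 is reaching the share through the WebClient service (it can fall back to WebDAV silently), transfers are capped by the FileSizeLimitInBytes registry value, which defaults to 50,000,000 bytes (about 47 MB). A sketch of raising the cap (needs admin rights; restart the WebClient service afterwards):

      import winreg

      KEY = r"SYSTEM\CurrentControlSet\Services\WebClient\Parameters"

      with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY, 0,
                          winreg.KEY_SET_VALUE) as key:
          # Default is 50000000 (~47 MB); 0xFFFFFFFF is the maximum (~4 GB).
          winreg.SetValueEx(key, "FileSizeLimitInBytes", 0,
                            winreg.REG_DWORD, 0xFFFFFFFF)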

    Read the article

  • TCP video streaming: TCP throughput(rate) and RTT

    - by misteryes
    We know that a rough estimate of TCP throughput is WINDOW/RTT, where WINDOW is min(CWND, RWND): CWND is the sender's congestion window and RWND is the receiver's advertised window. Nowadays the encoding rate of a video may be 1000 KB/s (8000 kbit/s); if the RTT is 500 ms, sustaining that rate needs a window of 1000 KB/s × 0.5 s = 500 KB. But the receive window is usually at most 64 KB (without window scaling), so there is a big gap. Does that mean that if the RTT is too large, TCP streaming is not possible? Is my understanding right? Thanks!
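
    The arithmetic, spelled out (plain bandwidth-delay-product reasoning; the escape hatch is the RFC 1323 window-scaling option, which modern stacks negotiate by default and which lifts the 64 KB ceiling):

      # Window needed to sustain the target rate over this RTT:
      rate_kb_s = 1000                         # 1000 KB/s = 8000 kbit/s
      rtt_s = 0.5                              # 500 ms round trip

      required_window_kb = rate_kb_s * rtt_s   # 500.0 KB must be in flight
      ceiling_kb_s = 64 / rtt_s                # 128.0 KB/s with a 64 KB window

      print(required_window_kb, ceiling_kb_s)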

    Read the article

  • SSL certificates with password encrypted key at hosting provider

    - by Jurian Sluiman
    We are a software company and offer hosting to our clients. We have a VPS at a large Dutch datacenter. For some of the applications, we need an SSL certificate whose key file we'd like to protect with a password. Our VPS reboots now and then because of updates and the like, but that means our Apache doesn't start right away, because the passwords have to be entered. This results in downtime, which is of course a really big problem. We could give the passwords to our VPS datacenter, or create certificates based on key files without passwords. Neither solution seems ideal, because both compromise the security of our keys. What's the best solution for this issue?
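
    mod_ssl offers a middle ground: keep the passphrase on the key, but let Apache fetch it at startup from a root-only program via SSLPassPhraseDialog, so the box reboots unattended while a stolen key file alone stays useless. A sketch (the script path and passphrase are placeholders); note this only narrows the exposure, since root on the VPS can still read the script:

      # httpd.conf (mod_ssl): ask an external program for key passphrases
      SSLPassPhraseDialog exec:/usr/local/sbin/pp-filter

      # /usr/local/sbin/pp-filter, owned by root, chmod 0700:
      #!/bin/sh
      echo "the-keyfile-passphrase"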

    Read the article

  • Routing Essentials

    - by zharvey
    I'm a programmer trying to fill a big hole in my understanding of networking basics. I've been reading a good book (Networking Bible by Sosinsky) but I have been finding that it contains a lot of "assumed" information, where terms and concepts are thrown at the reader without a proper introduction. I understand that a "route" is a path through a network, but I am struggling to visualize some routing-based concepts. Namely: How do routes actually manifest themselves in the hardware? Are they just lists of IP addresses that get computed at the network layer and then executed by the transport layer? What kind of data exists in a so-called routing table? Is a routing table just the mechanism for holding these lists of IP addresses (see above)? What are the performance pros and cons of a static route as opposed to a dynamic route?
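
    On the routing-table questions: an entry is not a full path but a single hop decision, typically (destination prefix, next hop, interface, metric), and a lookup returns the entry with the longest matching prefix. A toy illustration (all addresses are made up):

      import ipaddress

      # destination prefix -> (next hop, interface, metric)
      routing_table = {
          ipaddress.ip_network("0.0.0.0/0"):   ("192.168.1.1", "eth0", 10),
          ipaddress.ip_network("10.0.0.0/8"):  ("10.0.0.254",  "eth1", 5),
          ipaddress.ip_network("10.1.2.0/24"): ("10.1.2.1",    "eth1", 1),
      }

      def lookup(dest):
          """Longest-prefix match: the most specific route wins."""
          addr = ipaddress.ip_address(dest)
          matches = [net for net in routing_table if addr in net]
          return routing_table[max(matches, key=lambda n: n.prefixlen)]

      print(lookup("10.1.2.77"))  # ('10.1.2.1', 'eth1', 1): /24 beats /8
      print(lookup("8.8.8.8"))    # ('192.168.1.1', 'eth0', 10): default route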

    Read the article

  • Solaris TCP/IP performance tuning

    - by Andy Faibishenko
    I am trying to tune a high-message-traffic system running on Solaris. The architecture is a large number (600) of clients which connect via TCP to a big Solaris server and then send/receive relatively small messages (0.5 to 1 KB payload) at high rates. The goal is to minimize the latency of each message processed. I suspect that the TCP stack of the server is getting overwhelmed by all the traffic. What are some commands/metrics that I can use to confirm this, and if it is true, what is the best way to alleviate this bottleneck? PS: I posted this on StackOverflow originally. One person suggested snoop and dtrace. dtrace seems pretty general; are there any additional pointers on how to use it to diagnose TCP issues?
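
    Some starting points with stock Solaris tooling (a hedged sketch; counter names vary a little between releases). Rising listen-queue drop counters are the classic sign of an overwhelmed TCP stack:

      # Cumulative TCP MIB counters; watch for tcpListenDrop,
      # tcpListenDropQ0 and retransmit counts climbing under load:
      netstat -s -P tcp

      # The same counters in scriptable form:
      kstat -m tcp

      # DTrace: which TCP MIB counters are being bumped right now?
      dtrace -n 'mib:::tcp* { @[probename] = count(); }'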

    Read the article

  • Web Folder size/quota reporting tool?

    - by nctrnl
    I am currently using a Visual Basic script to determine how big the web folders are and what quota is assigned to each folder. The quota is in no way a physical limit, just a value inserted by me to decide whether a user is using too much space or not. The script does the job quite neatly and sends an HTML file by mail on a regular basis. The problem is that it's such a hassle to insert new quotas, since I have to fiddle around with the code. A central "control panel" with an overview and the ability to insert new quotas would be more suitable. Is there any software that can do the following:
      - Scan specified folders and subfolders
      - Report the folder sizes and present them in some sort of interface (could be a PHP/MySQL solution)
      - Let me specify a quota per folder and see the difference
    It is really important that the quota handling is made simple, so that a non-technician can handle it.
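
    As a stopgap that at least moves the quotas out of the code: a sketch where the quotas live in a plain CSV file, so adding one is a one-line edit rather than a script change (quotas.csv and its path,quota-in-MB format are assumptions):

      import csv
      import os

      def folder_size(path):
          """Total size in bytes of everything under path."""
          total = 0
          for dirpath, _dirs, files in os.walk(path):
              for name in files:
                  try:
                      total += os.path.getsize(os.path.join(dirpath, name))
                  except OSError:
                      pass              # unreadable or vanished file: skip
          return total

      # quotas.csv holds one "path,quota_mb" line per web folder.
      with open("quotas.csv", newline="") as f:
          for path, quota_mb in csv.reader(f):
              used_mb = folder_size(path) / 1024**2
              flag = "OVER" if used_mb > float(quota_mb) else "ok"
              print("%-4s %s: %.0f / %s MB" % (flag, path, used_mb, quota_mb))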

    Read the article

  • How to determine if a file has been backed up?

    - by Console
    I'm trying to consolidate old drives onto new ones of larger capacity. Sometimes files have been renamed but are otherwise identical. Sometimes an old directory has just a few more files in it than a newer directory with the same name. Sometimes a file has the same name but the size differs. So I often find myself asking: are there any files on this old drive or directory that I haven't already copied to the new drive? I just want to know that I have the files; I don't want to sync anything automatically (syncing tools tend to just sync, creating duplicate folder structures and other problems, so I prefer to do it by hand). Basically, if an old drive has a file called "foo.bar" ten directories deep, and my new big drive has an identical file called "oldstuff.zip" in the root, I just want a "yes, you have it" or "no, unique files exist". Is there a free tool, a script, or a quick and easy method (Mac/Unix or Windows) to get the answer?
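
    Content hashing answers exactly this "do I have it, whatever it is called" question. A minimal sketch: index the new drive by content hash, then list old-drive files whose hash is absent (SHA-256 over full contents, so it is slow on large trees but immune to renames and moves; the mount points are placeholders):

      import hashlib
      import os

      def file_hashes(root):
          """Map content hash -> one example path, for all files under root."""
          hashes = {}
          for dirpath, _dirs, files in os.walk(root):
              for name in files:
                  path = os.path.join(dirpath, name)
                  h = hashlib.sha256()
                  with open(path, "rb") as f:
                      for chunk in iter(lambda: f.read(1 << 20), b""):
                          h.update(chunk)
                  hashes.setdefault(h.hexdigest(), path)
          return hashes

      new = file_hashes("/Volumes/NewBigDrive")
      for digest, path in file_hashes("/Volumes/OldDrive").items():
          if digest not in new:
              print("unique to old drive:", path)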

    Read the article

  • PostgreSQL RAID configuration

    - by Yoldar-Zi
    I'm stuck on how best to configure a disk array. We have an HP P2000 G3 disk array with 24 SAS physical disks of 300 GB each. We need to configure this array to hold two copies of PostgreSQL 9.2, because there are two different systems. As we know, it's recommended to store database files and transaction logs (pg_xlog) on separate disks. So we would set up four logical disks: two with RAID 1 for transaction logs, and two with RAID 10 for the databases. Is this the right distribution scheme? Or would it be better to just make one big RAID 10 and carve four logical disks out of it?

    Read the article

  • Server location moved. How can I move the files?

    - by Bernhard
    Hello everyone, I have a big problem. I have to move data from an old webspace which is only accessible by FTP. Now we have a new root server which is, of course, accessible by SSH :-) I need to move all the data from the old space, but there are many GB of files. Is there a way to fetch all the files directly from the old FTP server onto the new storage, and not via a third station (my local machine)? I've tried it with ftp but without success; I think I used the wrong commands. Is there a way to accomplish something like this, including all files and directories? Thank you in advance, Bernhard

    Read the article

  • Transfer many Gigabytes between two servers

    - by Bernhard
    Hello, I have a big problem. I have to move data from an old webspace which is only accessible by FTP. The new root server is, of course, accessible by SSH :-) I need to move all the data from the old space, but the amount is just huge. Is there a way to move the files directly from the old FTP server onto the new storage, and not via a third station (my local machine)? I've tried it with ftp but it didn't work; I think I used the wrong commands. Is there a way to do this? Thank you in advance, Bernhard
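
    Since the new server has a shell, the trick is to run the FTP client on the new server itself, so the data flows directly from the old host to the new one with no third station. Command-line tools like wget -m or lftp's mirror command do this; below is a self-contained sketch with Python's standard ftplib (host, credentials, and paths are placeholders; detecting directories by attempting cwd is a common heuristic, since plain FTP has no reliable type listing):

      import ftplib
      import os

      def mirror(ftp, remote_dir, local_dir):
          """Recursively download remote_dir into local_dir."""
          os.makedirs(local_dir, exist_ok=True)
          ftp.cwd(remote_dir)
          for name in ftp.nlst():
              if name in (".", ".."):
                  continue
              remote_path = remote_dir.rstrip("/") + "/" + name
              try:
                  ftp.cwd(remote_path)          # succeeds: it's a directory
                  mirror(ftp, remote_path, os.path.join(local_dir, name))
                  ftp.cwd(remote_dir)
              except ftplib.error_perm:         # fails: it's a regular file
                  with open(os.path.join(local_dir, name), "wb") as f:
                      ftp.retrbinary("RETR " + name, f.write)

      ftp = ftplib.FTP("old-webspace.example.com")   # placeholder host
      ftp.login("username", "password")              # placeholder credentials
      mirror(ftp, "/", "/storage/old-webspace")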

    Read the article

  • Serving static web files off a non-standard port

    - by Nimmy Lebby
    I'm close to deploying a Django project to production and I'm looking over some infrastructure decisions. Something that came up was serving static files with a different server such as lighttpd. However, we're starting off with a single dedicated server, so our only option would be to use a non-standard port for the static-file webserver. Is there precedent for this? I.e., does anyone "big" do this? Any particular port I should use or shy away from? Can anyone think of downsides of going this route?

    Read the article

  • Filesystem to quickly get recent modifications

    - by liori
    Hello, I've got a relatively big filesystem (ext4) with lots of small files, and I'd like to back it up. Making frequent full backups is not feasible for me, so I want a way to make differential/incremental backups (differential preferred). But... this is a laptop, and scanning for changed files takes a lot of time. My questions: 1) Is it possible to get a list of files changed since some date from ext4's journal? I know it wasn't designed with this idea in mind, and it might be too small for bigger timespans, but maybe it is somehow possible? 2) Is it possible to monitor filesystem modifications and maintain a list of changed files reliably? I think I could use inotify, but this might be too slow to monitor a full filesystem and might be unreliable. (By reliable I mean: either I get all modifications since the last backup, with the list not missing anything, or an error message.) The laptop runs Debian unstable.
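
    On question 2: a sketch with the pyinotify library (the watch path and log file are placeholders). The reliability requirement is met by treating a kernel queue overflow as "the change list is no longer trustworthy" and falling back to a full scan, instead of silently missing events:

      import pyinotify

      wm = pyinotify.WatchManager()
      mask = (pyinotify.IN_CREATE | pyinotify.IN_DELETE |
              pyinotify.IN_MODIFY | pyinotify.IN_MOVED_TO |
              pyinotify.IN_MOVED_FROM)

      class Recorder(pyinotify.ProcessEvent):
          def process_default(self, event):
              with open("/var/tmp/changed-files.log", "a") as log:
                  log.write(event.pathname + "\n")

          def process_IN_Q_OVERFLOW(self, event):
              # Events were lost: the list is incomplete, so signal an
              # error and let the backup fall back to a full scan.
              raise RuntimeError("inotify queue overflow; list incomplete")

      notifier = pyinotify.Notifier(wm, Recorder())
      wm.add_watch("/home", mask, rec=True, auto_add=True)
      notifier.loop()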

    Read the article

  • Laptop Backup Software (Corporate)?

    - by Hutch
    I wondered whether any of you who have a fleet of laptops are using anything to back them up, and if so, what? In particular I'm looking for a solution that is totally hands-off once installed, i.e. the user doesn't have to do anything, press anything, or remember to change something when their domain password changes. Right now we use Druva inSync, which I have to say is pretty damned good; however, our license is up for renewal in a couple of months, so I want to be sure it's the best solution before renewing. The only other vaguely comparable product that I know of is from Atempo, but the cost of a SQL Server license is a big problem there. Thanks.

    Read the article
