Search Results

Search found 18409 results on 737 pages for 'large projects'.


  • A simple Volume Replication Tool for a large data set?

    - by Jin
    I'm looking for a solution to the following: Server A (Site A) - Win 2008 R2 - approx. 10 TB (15 TB max) of data - well over 8 million files; Server B (Site B) - Win 2008 R2. I want to asynchronously replicate Server A's volume to a volume on Server B for data redundancy, something I can point my users at ("go here for data") when/if Server A goes belly up due to machine problems, disaster, etc. Windows 2008 R2 does have DFS, but Microsoft apparently does not support a dataset this large (or, more accurately, more than 8 million files, according to the docs I could find). I also looked at Veritas Volume Replication, but that seems like overkill as I would also need Veritas Volume Manager. There are numerous backup tools that make a 1:1 copy, which would be OK, but since the transfer goes over the Internet, I'd like something that compresses during transfer the way DFS does. Does anyone have any suggestions?
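    One option worth testing (a sketch only, not a full DFS-R replacement, and it does not compress in transit): robocopy ships with Windows Server 2008 R2, copes with very large file counts, and can mirror a volume over a UNC path. Paths and thread count below are placeholders:

      robocopy E:\Data \\ServerB\Data /MIR /COPYALL /R:2 /W:5 /MT:16 /LOG:C:\Logs\replicate.log

    /MIR mirrors the tree (including deletions), /COPYALL preserves ACLs and ownership, and /MT:16 runs 16 copy threads; scheduled as a task it gives asynchronous, resumable one-way replication, though without DFS-R-style compression.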

    Read the article

  • How do I protect large file downloads through PHP and/or Apache?

    - by Eric
    We have some large files (1-8 GB) that are not publicly accessible. Currently we're serving them up through a PHP script that buffers each file in 1 MB chunks and writes it to the output. It's incredibly CPU intensive and slows the server down when only a few downloads are active. We want to move the file-transfer work to Apache or a more efficient method. We are using cookie authentication. FTP downloads are out unless there's some way to authenticate FTP sessions through the existing PHP session cookie. Ideally we'd like something where we can use PHP to hide the link to the file while it passes off the file transfer to Apache, which is no doubt far more efficient at HTTP file transfers than PHP. We want to be able to resume downloads as well. Any help is appreciated.
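    One common way to do exactly this (a sketch, assuming the third-party mod_xsendfile Apache module is installed and the files live under /data/files, both placeholders) is to let PHP do the cookie check and then hand the transfer to Apache with an X-Sendfile header; Apache streams the file itself and handles Range requests, so resumed downloads work:

      # Apache vhost
      XSendFile On
      XSendFilePath /data/files

      <?php
      // download.php - names are illustrative; only the X-Sendfile hand-off matters
      session_start();
      if (empty($_SESSION['user_id'])) { header('HTTP/1.1 403 Forbidden'); exit; }

      $file = '/data/files/' . basename($_GET['f']);   // basename() blocks path traversal
      if (!is_file($file)) { header('HTTP/1.1 404 Not Found'); exit; }

      header('Content-Type: application/octet-stream');
      header('Content-Disposition: attachment; filename="' . basename($file) . '"');
      header('X-Sendfile: ' . $file);                  // Apache takes over from here
      ?>

    Lighttpd (X-LIGHTTPD-send-file) and nginx (X-Accel-Redirect) offer equivalent mechanisms if the web server ever changes.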

    Read the article

  • Why is Internet access and Wi-Fi always so terrible at large tech conferences?

    - by Joel Spolsky
    Every tech conference I've ever been to, and I've been to a lot, has had absolutely abysmal Wi-Fi and Internet access. Sometimes it's the DHCP server running out of addresses. Sometimes the backhaul is clearly inadequate. Sometimes there's one router for a ballroom with 3000 people. But it's always SOMETHING. It never works. What are some of the best practices for conference organizers? What questions should they ask the conference venue or ISP to know, in advance, if the Wi-Fi is going to work? What are the most common causes of crappy Wi-Fi at conferences? Are they avoidable, or is Wi-Fi simply not an adequate technology for large conferences?

    Read the article

  • What is the reason large sites don't use MySQL with ASP.NET?

    - by Luke101
    I have read this article from High Scalability about Stack Overflow and other large websites. Many large, high-traffic .NET sites such as plentyoffish.com, MySpace and SO all use .NET technologies and SQL Server for their database. In the article, SO is quoted as saying: "As you add more and more database servers the SQL Server license costs can be outrageous. So by starting scale up and gradually going scale out with non-open source software you can be in a world of financial hurt." I don't understand why high-traffic .NET sites don't convert their databases to MySQL, as it is way cheaper than SQL Server.

    Read the article

  • Is my large Windows folder slowing down my machine?

    - by Moses
    I have a problem with my Windows installation running very slow and my Windows folder being too large, and I suspect the two are related. My Windows folder is 17.4 GB. I have 1807 folders, totalling 2.4 GB, that are prefixed with a $. My System32 folder is 1.55 GB. My Microsoft.NET folder is 654 MB – I don't know what programs, if any, are using it. My Service Pack folder is 568 MB. The Software Distribution folder is 536 MB. The ie8updates folder is 380 MB. How can I reduce the size of these folders, and could their size be why I am running so slow?

    Read the article

  • Which Large File System Format to use for USB Flash drive compatible with Ubuntu/Mac/Windows?

    - by wajiw
    I've had this problem for a long time and can't find a solution. I switch between the three OSes all the time and use a 1 TB USB drive to do so. I can't seem to find a format that is compatible across all systems and handles large files (at least 8-9 GB). Does anyone have a solution for this? Recently I've tried exFAT, but that messes up the filesystem when reading on Windows after adding files from Ubuntu (using the FUSE driver). The OSes I'm currently using are Windows Vista/7, Mac OS X (10.6.5) and Ubuntu 10.10.

    Read the article

  • What is the easiest way to reference libraries in Qt projects?

    - by Jake Petroules
    I have two Qt4 GUI application projects and one shared library project, all referenced under a .pro file with the "subdirs" template, so it looks like: exampleapp.pro, app1.pro, app2.pro, sharedlib.pro. Now, what I want to do is reference sharedlib from app1 and app2 so that every time I run app1.exe, I don't have to manually copy sharedlib.dll from its own folder to app1.exe's folder. I could set the PATH environment variable in the projects window, but this isn't very portable. I've looked at putting the LIBS variable in the app1.pro file, but I'm not sure if that refers to statically linked libraries only - I've tried various syntaxes and it doesn't seem to work with shared libs.
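    A sketch of one common arrangement (it assumes sharedlib builds with TARGET = sharedlib and an in-source build; shadow builds need $$OUT_PWD-style paths): point every sub-project's DESTDIR at one shared output directory so sharedlib.dll lands next to app1.exe, and link via LIBS:

      # sharedlib.pro
      TEMPLATE = lib
      DESTDIR  = ../bin

      # app1.pro (same for app2.pro)
      TEMPLATE     = app
      DESTDIR      = ../bin
      INCLUDEPATH += ../sharedlib
      LIBS        += -L../bin -lsharedlib

    The LIBS entry resolves against the import library that the DLL build produces, so it works for shared libraries as well as static ones.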

    Read the article

  • How can I share Configuration Settings across multiple projects in Visual Studio?

    - by Muneeb
    OK, I know this may be a design issue, so I would love remarks on that as well. I have a Visual Studio web application solution with three projects: UserInterface, BusinessLogic and DataAccess. I had to store some user-defined settings, so I created configSections in the config file. I access these configSections through classes which inherit from .NET's ConfigurationSection base class. In short, every project has its own configSection, and for each one there is a class in that project inheriting from ConfigurationSection to access its settings. This all works fine. But the problem arises when there is a setting I need to use across multiple projects. If I need to use a setting defined in the UserInterface project's configSection in, let's say, the BusinessLogic project, I have to actually make a copy of that setting in BusinessLogic's configSection. This ends up with the same setting copied across multiple configSections. Isn't that a bit too redundant?
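    One way to avoid the duplication (a sketch; the section and file names are made up) is to move the shared settings into their own file and reference it from each project's config with the configSource attribute, which works for any ConfigurationSection-based section:

      <!-- web.config / app.config in each project -->
      <configSections>
        <section name="sharedSettings" type="MyCompany.Configuration.SharedSettingsSection, MyCompany.Configuration" />
      </configSections>
      <sharedSettings configSource="SharedSettings.config" />

      <!-- SharedSettings.config: one copy of the actual values -->
      <sharedSettings smtpServer="mail.example.com" retryCount="3" />

    configSource requires the file to sit in (or under) each application's own directory, which is why it is usually added to the other projects as a Visual Studio linked file that gets copied on build.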

    Read the article

  • Where can I find a large body of *Python3* source code?

    - by Ira Baxter
    I'm testing a Python parser. I have Python 2.6/2.7 firmly under control, and some good (large) code samples on which I've tested it. I'm now interested in testing my Python 3 variant. I've been to various Python open source web sites (e.g., http://pythonsource.com/), which list lots of packages, but they are pretty unclear as to whether these are Python 2.x or 3.x source files. The several samples that I downloaded all turned out to be Python 2.x. Where can I find a good number of large Python 3 codebases? I don't really want 1000 little separate Python 3 files; I prefer big applications.

    Read the article

  • Why does cpio say "WARNING! These file names were not selected" when copying a large number of files

    - by mmm bacon
    For over 10 years, I've been using this strategy to copy a large number of files between UNIX filesystems: cd source_directory; find . -depth -print | cpio -pdm /path/to/destination_directory. It works like a champ. However, I'm now getting this error from cpio: cpio: WARNING! These file names were not selected: (long list of files here...). The source directory is on OS X 10.5, and the destination is an NFS filesystem from an OpenSolaris server. Copying over NFS has never been a problem in the past. There's nothing strange about the filenames - no special characters or anything like that. Any ideas?

    Read the article

  • Can a single solution hold projects from multiple repositories?

    - by cyclotis04
    I've begun setting up SVN repositories to store my code, and am wondering if a single Visual Studio solution can have projects from multiple repositories. I have a shared library with various helper functions, generic custom controls, etc. that are used by multiple projects and hosted in its own repository. Then I have my project repository, which contains all of the program-specific code such as forms. I know I could copy the shared library into the program's repository, then copy the changes back when I make them, but I'd much rather keep them in different repositories so I can hit "Commit" and the general library commits to its repository while the program code commits to its own. I'm currently using AnkhSVN, but if it's possible with other tools, I'll look into them. Preemptive clarification for all the "just use one repository" answers: the shared library is hosted in an online repository, viewable by anyone, but the program code is proprietary and resides on our office servers, so they need different repositories.
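    The usual way to pull a library from a second repository into each program's working copy (and have AnkhSVN or TortoiseSVN treat it naturally) is an svn:externals definition on the solution folder. A sketch, with a placeholder URL:

      cd C:\src\MyProgram
      svn propset svn:externals "SharedLib http://svn.example.com/sharedlib/trunk" .
      svn update        # checks out the external into .\SharedLib

    Commits in the program's working copy do not recurse into the external by default, so the shared library still commits to its own repository, which matches the workflow described above.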

    Read the article

  • How do large companies handle software updates for users without administrative rights?

    - by CT
    I just started working for a small-to-medium-size company doing IT support - maybe 150 or fewer users. Right now every user has administrative rights on their own machine. This lets them install updates or whatever else they would like. I'm tired of getting on users' machines that are bloated with crap they put on themselves, so my first thought was to take away administrative rights to their computers. This would also have other advantages, such as preventing a lot of drive-by malware on the web. The problem is that users are then unable to install updates (even though I find most ignore these anyway). How do large companies handle software updates on all client machines? EDIT: Windows environment. Most servers are Windows Server 2003 Enterprise. Clients are all Windows: XP, Vista, and 7.

    Read the article

  • MySQL LEFT JOIN error

    - by Alex
    Hello, I've got some SQL that used to work with an older MySQL version, but after upgrading to MySQL 5, I'm getting an error. Here's the SQL: SELECT portfolio.*, projects.*, types.* FROM projects, types LEFT JOIN portfolio ON portfolio.pfProjectID = projects.projectID WHERE projects.projectType = types.typeID AND types.typeID = #URL.a# ORDER BY types.typeSort, projects.projectPriority ASC. The new error I'm receiving is: "Unknown column 'projects.projectID' in 'on clause'". How can I convert this to SQL compatible with the newer MySQL version? Thanks very much!
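    For reference, this is the MySQL 5.0+ change in join precedence: the comma operator now binds less tightly than JOIN, so the ON clause can no longer see the projects table. A sketch of the usual fix, writing every join explicitly (the ColdFusion #URL.a# placeholder is kept as-is):

      SELECT portfolio.*, projects.*, types.*
      FROM projects
      INNER JOIN types ON projects.projectType = types.typeID
      LEFT JOIN portfolio ON portfolio.pfProjectID = projects.projectID
      WHERE types.typeID = #URL.a#
      ORDER BY types.typeSort, projects.projectPriority ASC

    Wrapping the comma-joined tables in parentheses (FROM (projects, types) LEFT JOIN ...) also restores the old behaviour.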

    Read the article

  • File storage service that allows clients to upload large files to my account?

    - by deceze
    Can anyone recommend an online file storage service which fulfills these requirements?
      - I can create an account
      - I can invite clients to upload files into my account
      - clients do not need to register to be able to upload
      - clients must not be able to see anything but their own files, or must not see any files at all - they get only a dropbox
      - only I can access the uploaded files; everything is non-public
      - the service is multi-lingual
    I just need clients to be able to send me potentially large files in a dead simple manner online, that's all. No registration step to go through, no software to download, no syncing or sharing. No setting up of individual folders and permissions for each individual client. No copying and pasting of links (à la MediaFire, RapidShare, etc).

    Read the article

  • How to save an HTMLElement (table) in localStorage?

    - by Hagbart Celine
    I've been trying this for a while now and could not find anything online... I have a project where table rows get added to a table. Works fine. Now I want to save the table in localStorage so I can load it again (overwriting the existing table).

      function saveProject(){
          //TODO: Implement Save functionality
          var projects = [];
          projects.push($('#tubes table')[0].innerHTML);
          localStorage.setItem('projects', projects);
          //console.log(localStorage.getItem('projects'));
      }

    The problem is the array "projects" has (after one save) 2000+ elements. But all I want is the whole table saved to the first index (or appended later). In the end I want the different saves to be listed in an option element:

      function loadSaveStates(){
          alert('loading saved states...');
          var projects = localStorage.getItem('projects');
          select = document.getElementById('selectSave'); //my Dropdown
          var length = projects.length,
              element = null;
          console.log(length);
          for (var i = 0; i < length; i++) {
              element = projects[i];
              var opt = document.createElement('option');
              opt.value = i;
              opt.innerHTML = 'project ' + i;
              select.appendChild(opt);
          }
      }

    Can anyone tell me what I am doing wrong?
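    The 2000+ elements come from localStorage coercing values to strings: setItem stores the array as one long string, and getItem hands that string back, so the loop iterates characters rather than saves. A sketch of the usual fix (same #tubes table and #selectSave markup assumed), serialising with JSON:

      function saveProject() {
          // read existing saves, append the current table HTML, write back as JSON
          var projects = JSON.parse(localStorage.getItem('projects') || '[]');
          projects.push($('#tubes table')[0].innerHTML);
          localStorage.setItem('projects', JSON.stringify(projects));
      }

      function loadSaveStates() {
          var projects = JSON.parse(localStorage.getItem('projects') || '[]');
          var select = document.getElementById('selectSave');
          for (var i = 0; i < projects.length; i++) {
              var opt = document.createElement('option');
              opt.value = i;
              opt.innerHTML = 'project ' + i;
              select.appendChild(opt);
          }
      }

    Restoring a chosen save is then just $('#tubes table')[0].innerHTML = JSON.parse(localStorage.getItem('projects'))[select.value];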

    Read the article

  • How well does Solr scale over a large number of facet values?

    - by Continuation
    I'm using Solr and I want to facet over a field "group". Since "group" values are created by users, there can potentially be a huge number of them. Would Solr be able to handle a use case like this, or is Solr not really appropriate for facet fields with a large number of values? I understand that I can set facet.limit to restrict the number of values returned for a facet field. Would this help in my case? Say there are 100,000 matching values for "group" in a search and I set facet.limit to 50 - would that speed up the query, or would the query still be slow because Solr still needs to process and sort through all the facet values before returning the top 50? Any tips on how to tune Solr for a large number of facet values? Thanks.
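    Solr can generally cope with high-cardinality facet fields, but facet.limit only trims the response - the counting pass still touches every distinct value - so the facet method and caches matter more. A sketch of the query parameters typically tuned for this (standard Solr parameters, values illustrative):

      q=*:*
      &facet=true
      &facet.field=group
      &facet.limit=50        (top 50 returned; counting still covers all values)
      &facet.mincount=1      (drop zero-count values)
      &facet.method=fc       (field-cache counting, usually the right choice for many distinct values)

    facet.method=enum, by contrast, runs one filter query per distinct value and is only sensible for low-cardinality fields.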

    Read the article

  • Creating a file server - How can I use a large VHD file in Hyper-V? (700GB)

    - by barfoon
    Hey everyone, after a few discussions (here, here, and here), I am still unable to create a simple VM that will be used as a file server hosted on my Hyper-V box. I have created a fixed 700 GB SCSI drive (.vhd file), as I have learned an IDE drive of this size is not possible. Not to sound too cynical, but it's blown me away how much trouble it's been to create a large amount of space and start using it. What is the best way to create a file server with a drive of this size hosted on Hyper-V Server 2008, and how can I get it going? Details on the OS, drivers, integration tools, etc. - anything you feel is required - would be greatly appreciated. Extra information: I am using the stand-alone version of Hyper-V Server, not Windows Server 2008. I have tried loading the Linux Integration Tools (linked in the comments of the last link above) onto a SUSE 11 VM and the installation fails; the machine cannot see the VHD at all. Thanks very much.

    Read the article

  • Why do open source projects cling to 0.x versions for so long?

    - by ssg
    I see many open source projects insist on staying at a 0.x version for a very long time even though the product has proven useful and very stable. Trac is one example. They even risked switching from 0.9 to 0.10, which might confuse a lot of users about which is more recent. I wonder if this is a cultural paradigm, an honor code in the open source community, or simply a strict interpretation of release cycle management. Would a person who releases the first version as "1.0 beta" be banished from the open source world, or, more realistically, attract fewer contributors? For some projects it even looks like they will never reach 1.0 at all, only ever getting halfway closer each time, like Zeno's paradox.

    Read the article

  • How should I set up Flex 3 projects that reference common controls?

    - by Amy
    I'm not a Flash developer, and I'm having issues figuring out how to set up these two projects that I have in Flex Builder. I've already created projA, which has a .mxml that references several custom controls and skins from com.xxx.controls within projA. I now have to build projB, which also has a .mxml and will produce a different .swf, and I want to use some of the same controls from projA. I currently build projA through the command line and NAnt and will need to do the same for projB. Should I create a new project to move all of the common controls into? How do I then use this library project in both projects and compile via the command line? Thanks!

    Read the article

  • Fast way to perform addition of 2 LARGE float arrays in Android. Optionally JNI or OpenGL ES

    - by nathan
    I simply need to add floatArray1 to floatArray2, storing the result in floatArray2 - no third array. All arrays are one-dimensional but very large, probably as large as the OS will let me get away with. The max I would need is two float arrays with 40,000 floats each, but I could get away with a tenth of that at minimum. I would love to do this in 1/30th or 1/60th of a second, but is that possible? Also, if the code is JNI/NDK or OpenGL ES, that's fine. Does Android have an assembly language or machine code I could use somehow?
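    For scale, two 40,000-element float arrays are only about 320 KB, so a plain Java loop is often already well inside 1/60 s; if profiling still points at it, a minimal NDK/JNI sketch (hypothetical class com.example.NativeMath declaring public static native void addInto(float[] dst, float[] src);) looks like this:

      #include <jni.h>

      /* dst[i] += src[i]; the result stays in dst, no third array */
      JNIEXPORT void JNICALL
      Java_com_example_NativeMath_addInto(JNIEnv *env, jclass clazz,
                                          jfloatArray dst, jfloatArray src)
      {
          jsize i, n = (*env)->GetArrayLength(env, dst);
          jfloat *d = (*env)->GetFloatArrayElements(env, dst, NULL);
          jfloat *s = (*env)->GetFloatArrayElements(env, src, NULL);

          for (i = 0; i < n; ++i)
              d[i] += s[i];

          (*env)->ReleaseFloatArrayElements(env, src, s, JNI_ABORT);  /* src not modified */
          (*env)->ReleaseFloatArrayElements(env, dst, d, 0);          /* copy result back */
      }

    There is no way to hand-write Dalvik machine code from Java, but the NDK does accept ARM assembly and NEON intrinsics if the loop ever needs them.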

    Read the article

  • How can I send super large files directly to another computer on the Internet for free?

    - by Cruise
    I regularly need to transfer very large files (30 GB) - financial statistics - to my friend. I don't have any problem with bandwidth: there is plenty here. I did some research in the area, so: 1. I would not use FTP, as it is very tricky to get working behind a NAT. 2. I would not use Skype/MSN/ICQ, as they are not designed for file transfer and underperform on huge files. 3. I would not use file-sharing services, as I would need to pay for big files (30 GB is a problem here) and I don't like keeping any piece of my data on a third-party server. So, I need some smart tool that does what I need: sending files directly browser-to-browser, not browser-server-browser. Is that so complex? Is there some web application on the Internet that can do this?

    Read the article

  • How could I portably split large backup files over multiple discs?

    - by sourcejedi
    Context: I make backups/archives, primarily of photos. I'm experimenting with Bup, which is designed for backup to hard disk; basically it creates Git repos which include packfiles of up to 1 GB. But I still need last-ditch backups to keep offline and move offsite (and keeping them on read-only media is good too!). What are the options for archiving and splitting large files over several discs like CDs (and reading them back!)? I'd prefer methods which:
      - will stay readable in the future
      - are portable, e.g. to Windows
      - have known, simple implementations, so I could re-implement them myself if necessary
    (Using Bup packs will stretch my robustness budget, so I want to be confident about how other parts of the system would behave.) I heard split archives are possible with both ZIP and 7-Zip. Is that right?
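    Both ZIP and 7-Zip do support multi-volume archives (7z's -v switch, e.g. -v650m), but the most portable and easily re-implementable option is a plain byte split of a single archive, since any tool - or a few lines of code - can concatenate the parts again. A sketch with standard tools (assuming GNU split; sizes chosen for ~700 MB CDs):

      tar czf photos.tar.gz photos/
      split -b 650M photos.tar.gz photos.tar.gz.part_

      # burn each photos.tar.gz.part_* to its own disc; to restore:
      cat photos.tar.gz.part_* > photos.tar.gz
      tar xzf photos.tar.gz

    The restore step is trivially portable: on Windows, copy /b part_aa + part_ab ... photos.tar.gz does the same concatenation.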

    Read the article

  • Copying a large directory tree locally? cp or rsync?

    - by Rory
    I have to copy a large directory tree, about 1.8 TB. It's all local. Out of habit I'd use rsync; however, I wonder if there's much point and whether I should rather use cp. I'm worried about permissions and uid/gid, since they have to be preserved in the copy (I know rsync does this), as well as things like symlinks. The destination is empty, so I don't have to worry about conditionally updating some files. It's all local disk access, so I don't have to worry about SSH or the network. The reason I'd be tempted away from rsync is that rsync might do more than I need: rsync checksums files. I don't need that, and I'm concerned that it might take longer than cp. So what do you reckon, rsync or cp?
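    For what it's worth, rsync does not checksum unchanged files by default - it compares size and modification time, and with an empty destination everything is copied outright anyway - so the overhead difference is small. A sketch of equivalent invocations (paths are placeholders):

      # cp: archive mode preserves permissions, ownership, timestamps and symlinks
      cp -a /srv/source/. /srv/destination/

      # rsync: the same preservation, plus hard links (-H), progress output,
      # and the ability to re-run and pick up where it left off
      rsync -aH --progress /srv/source/ /srv/destination/

    The restartability is the main practical argument for rsync on a 1.8 TB copy.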

    Read the article
