Search Results

Search found 37650 results on 1506 pages for 'files'.


  • Why is the server performance so poor? What can be done to improve the speed of the server?

    - by fslsyed
    Very slow processing using Windows Server 2008 R2 Standard with Service Pack 1.

    Situation: Read a text file, using the text data to populate a series of MS SQL tables. The converted data is used to generate monthly PDF invoice files; the PDF files are saved directly to the hard drive. The application is multi-threaded, with one thread used for the text conversion and three threads for PDF invoice generation; the text conversion occurs concurrently with the invoice generation. (A sketch of this pipeline follows below.)

    Application software: C# using Microsoft Visual Studio 2010 Ultimate; Crystal Reports 2011 with the 13_0_3 64-bit runtime; targeted platform x64 (also tested as x86 and Any CPU, with similar results); Microsoft .NET Framework 4.0; Microsoft SQL Server 2008.

    Issue: The software is running very slowly. Conversion of the text file runs at approximately six hundred fifty records per second, and generation of the PDF files at approximately twelve invoices per minute. The text file to be converted is six hundred MB, with seven thousand invoices to be generated. The software was installed on three different machines from the same distribution files, the same text file was converted on each machine, and the user executing the application was an administrator on each machine. The only variances were the machine and operating system. The configurations are as follows:

    Server: Windows Server 2008 R2 Standard 64-bit (6.1, build 7601) SP1; IBM System x3550 M3 [7944AC1]; default system BIOS; Intel Xeon E5620 @ 2.4 GHz (16 CPUs); 16384 MB RAM.
    Notebook: Windows 7 Home Premium 64-bit (6.1, build 7601); HP Pavilion dv7 Notebook PC; default system BIOS; AMD Phenom II N640 dual-core 2.9 GHz (2 CPUs); 6144 MB RAM.
    Desktop: Windows 7 Professional 64-bit (6.1, build 7601) SP1; Dell OptiPlex 960; Phoenix ROM BIOS PLUS version 1.10 A11; Intel Core 2 Quad Q9650 @ 3.00 GHz (4 CPUs); 16384 MB RAM.

    Processing results per machine (averages over seven runs):

        Machine      Text records converted/min    Invoices generated/min
        Server (1)   650                           12
        Notebook     980                           17
        Desktop      2,100                         45

        (1) The server is dedicated to execution of this application; no additional applications are being executed.

    Question: Why is the server performance so poor? What can be done to improve the speed of the server?
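
    A minimal sketch of the producer/consumer layout described above, assuming .NET 4.0's BlockingCollection; the file name, the parse step, and RenderPdf are hypothetical stand-ins for the SQL load and the Crystal Reports call:

        using System;
        using System.Collections.Concurrent;
        using System.Threading.Tasks;

        class InvoicePipeline
        {
            static void Main()
            {
                // Bounded queue so the reader cannot run arbitrarily far ahead of the PDF threads.
                var jobs = new BlockingCollection<string>(boundedCapacity: 100);

                // Producer: the single text-conversion thread.
                var producer = Task.Factory.StartNew(() =>
                {
                    foreach (var line in System.IO.File.ReadLines("invoices.txt"))
                        jobs.Add(line);              // parsing + SQL inserts would happen here
                    jobs.CompleteAdding();
                });

                // Consumers: the three PDF-generation threads.
                var consumers = new Task[3];
                for (int i = 0; i < 3; i++)
                    consumers[i] = Task.Factory.StartNew(() =>
                    {
                        foreach (var job in jobs.GetConsumingEnumerable())
                            RenderPdf(job);          // Crystal Reports rendering would happen here
                    });

                Task.WaitAll(consumers);
            }

            static void RenderPdf(string record) { /* placeholder */ }
        }

    With a structure like this, timing the producer and the consumers separately is often the quickest way to see whether the disk, SQL Server, or the report engine is the bottleneck on the slow machine.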

    Read the article

  • Darwin Streaming Server VOD playlist?

    - by StackedCrooked
    I have a large MPEG4 file split up into a number of smaller files. I would like to make these files available as one movie. So I need something like a playlist. It seems the playlist option for live streaming is not available for VOD. Is there a way to get this functionality somehow?

    Read the article

  • Browser caching issue on an HTTPS site when pressing F5

    - by sushil bharwani
    I am working on a website where I have a content entry form. The form contains a TinyMCE control, which is composed of some 40-50 files. Testing reported that the entry form loads slowly, and every time it shows 50 files loading before the page completes. Is there a way I can decrease this time? I have made use of browser caching by setting the Expires header of static content to a far-future date. When I access the form through its link a second or later time, it loads fast without listing 40 files remaining, but when I press F5 it reloads the entire page. I am confused as to how F5 differs from clicking the link. Just to add, my URL is HTTPS. Any suggestion to improve the performance of this form would be great.
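
    For what it's worth, F5 generally makes the browser revalidate cached entries (conditional requests with If-Modified-Since/If-None-Match) even when their Expires dates are far in the future, whereas plain navigation uses the cache directly; that alone can explain the difference. A hedged example of far-future expiry for the TinyMCE files, assuming Apache with mod_expires and a hypothetical path:

        <Directory "/var/www/site/js/tiny_mce">
            ExpiresActive On
            ExpiresDefault "access plus 1 year"
        </Directory>

    Even with this in place, a refresh will still send one conditional request per file; answering them with fast 304s is then the best you can do short of combining the files.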

    Read the article

  • Embedding an existing exe file into another C++ program

    - by Milad
    Is there a way to link an existing .exe file with other C++ source files during compilation? What I'm actually trying to do is to compress and decompress some files in my console program using the LZMA (7-Zip) SDK, but unfortunately it's very difficult for a newbie to use. There is a command-line version of LZMA called 7za.exe, and I am wondering if I can somehow embed it into my program and use it like a function. It can easily be invoked with the system() function (which seems to be a very dangerous thing to use), but then if I send my program to someone who doesn't have 7za.exe in the right folder, it won't work. I came across the CreateProcess() function in the windows.h header, which seems to achieve what system() does in a more proper and advanced way, but I don't know if it can actually link the .exe file like an object file during compilation.
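
    An .exe can't be linked like an object file, but a common pattern is to ship 7za.exe alongside the program (or embed it as a resource and extract it at startup) and launch it with CreateProcess(). A minimal sketch, assuming Windows and that 7za.exe sits next to your binary; the argument string is hypothetical:

        #include <windows.h>
        #include <string>

        // Runs "7za.exe <args>" and waits; returns true on exit code 0.
        bool RunSevenZip(const std::wstring& args)
        {
            std::wstring cmd = L"7za.exe " + args;   // resolved relative to the working directory
            STARTUPINFOW si = { sizeof(si) };
            PROCESS_INFORMATION pi = {};

            // CreateProcess may modify the command line, so pass a writable buffer.
            if (!CreateProcessW(NULL, &cmd[0], NULL, NULL, FALSE,
                                CREATE_NO_WINDOW, NULL, NULL, &si, &pi))
                return false;

            WaitForSingleObject(pi.hProcess, INFINITE);
            DWORD exitCode = 1;
            GetExitCodeProcess(pi.hProcess, &exitCode);
            CloseHandle(pi.hThread);
            CloseHandle(pi.hProcess);
            return exitCode == 0;
        }

        int main()
        {
            return RunSevenZip(L"a archive.7z file.txt") ? 0 : 1;
        }

    Unlike system(), this avoids spawning a shell and gives you control over the window, the wait, and the exit code; linking the LZMA SDK's C API directly remains the cleaner (if harder) route.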

    Read the article

  • Trying to use the 7-zip self extracting archive GUI and it fails [closed]

    - by djangofan
    Trying to use the 7-Zip self-extracting archive GUI, and it fails. When I create an archive with the "Self Extracting Installer" option, at the end of the install it runs my batch file, and it appears to extract all the files just before doing so, but after extraction the files are nowhere to be found (except within the .7z archive). Any idea why this occurs? https://code.google.com/p/sfx-creator/

    Read the article

  • Which web server architecture do you think is better?

    - by ngache
    Option 1: use Apache to serve dynamic requests that need to be processed by PHP, and use nginx to serve static files. Option 2: use nginx to serve all requests. So the key point is: which of them is more efficient at serving dynamic requests? (We have no doubt that nginx is much better than Apache at serving static files.)
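
    For reference, a hedged sketch of option 1's front end, assuming nginx listens on port 80 and proxies PHP requests to Apache on 127.0.0.1:8080; paths and ports are assumptions:

        server {
            listen 80;
            root /var/www/site;

            # Static files served straight off disk by nginx.
            location ~* \.(css|js|png|jpg|gif|ico)$ {
                expires 30d;
            }

            # Dynamic requests handed to Apache/mod_php.
            location ~ \.php$ {
                proxy_pass http://127.0.0.1:8080;
                proxy_set_header Host $host;
                proxy_set_header X-Real-IP $remote_addr;
            }
        }

    Option 2 would replace the proxy_pass block with a fastcgi_pass to PHP-FPM, removing Apache from the picture entirely.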

    Read the article

  • Server 2008 R2 - wbadmin systemstatebackup - System Writer not found in the backup

    - by TWood
    I am trying to manually run a systemstatebackup command on my Server 2008 R2 box, and I get error code 2155347997 when I view the backup event log details. The command line tells me that log files were written to c:\windows\logs\windowsserverbackup\, but I have no .log files there. The command window reports "System Writer is not found in the backup"; however, when I run vssadmin list writers, System Writer appears in the list with normal status and no stored last errors. I am running this from an elevated command prompt, from a logged-on administrator account. My backup target path grants Network Service full control and has plenty of free space. In the event log I see two VSS error 8194 entries immediately before Backup error 517, which carries error code 2155347997; all three are a result of running the systemstatebackup command. My belief is that some VSS-related permission failure aborts the backup before it ever really starts, so the code that creates the log files never runs, which is why I have none. Watching the windowsserverbackup directory while the command runs, I do see a Wbadmin.0.etl file get created, but it is deleted when the backup errors out and stops. I have looked online, and there are numerous opinions as to the cause of this error. These are the things I have already tried:

    1. The machine runs an HP 1410i Smart Array controller but at one time also used an LSI SCSI card. Using networkadminkb.com's KB a467, I found one LSI_SCSI entry under HKLM\SYSTEM\CurrentControlSet\Services whose Start value was set to 0x0, and changed it to 0x3. No change.
    2. Under HKLM\SYSTEM\CurrentControlSet\Services\VSS\Diag I gave Network Service full control, where it previously had only "Special Permission". No change.
    3. I followed KB2009272 to manually try to fix System Writer.

    What else should I look at to resolve this issue? It may be relevant that this server runs Mozy Pro, which was known in the past to use VSS for copying operations and occasionally threw an error, though since an update last year those event log entries have stopped.
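
    For reference, the commands under discussion; the target volume letter is an assumption:

        wbadmin start systemstatebackup -backupTarget:E:
        vssadmin list writers

    Checking vssadmin list writers immediately after a failed run (rather than later) sometimes catches a writer in a failed state before it resets.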

    Read the article

  • Robocopy with local catalog of remote data for incremental backup

    - by Bill
    I am currently using robocopy to an extremely slow destination. The compare between source and destination files can take a while to run through. Since the destination will never change (apart from the robocopy changes), is there any program that will work similarly to robocopy, but have a local list of what files (attributes and timestamps) the destination has, to compare with? I know there are expensive solutions which may do this, but I'm looking for something free if possible. Hopefully this makes sense.
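
    A minimal sketch of the "local catalog" idea in Python, assuming name + size + mtime is a good-enough change test (robocopy's default comparison is similar); paths are hypothetical:

        import json, os

        CATALOG = "catalog.json"
        SOURCE = r"C:\data"

        def scan(root):
            """Snapshot relative path -> [size, mtime] for every file under root."""
            snapshot = {}
            for dirpath, _dirs, files in os.walk(root):
                for name in files:
                    path = os.path.join(dirpath, name)
                    st = os.stat(path)
                    snapshot[os.path.relpath(path, root)] = [st.st_size, int(st.st_mtime)]
            return snapshot

        old = {}
        if os.path.exists(CATALOG):
            with open(CATALOG) as f:
                old = json.load(f)

        new = scan(SOURCE)
        for p in (p for p, sig in new.items() if old.get(p) != sig):
            print(p)                      # feed this list of changed files to the copy tool

        with open(CATALOG, "w") as f:     # the catalog becomes the stand-in for the slow destination
            json.dump(new, f)

    Since the destination never changes except through your own copies, the saved catalog replaces the expensive remote comparison entirely.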

    Read the article

  • How do I use TrueType fonts with LaTeX?

    - by harper
    I need to use a font family in my LaTeX documents that is available as 18 .ttf (TrueType font) files. Where do I have to copy the files in my MiKTeX 2.8 installation? How do I make the fonts available to LaTeX? I usually use pdfLaTeX. I read in "Truetype-Fonts in LaTeX" that TTF fonts are available without creating all the .tfm files. What is necessary for this case? Can I install the fonts in the local texmf directory? I would like to keep the system installation isolated from my manually added files. It would probably also make it easier to copy this font family to another installation.
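
    One route worth mentioning: if switching from pdfLaTeX to XeLaTeX or LuaLaTeX is an option, the fontspec package loads .ttf files directly, with no .tfm step, and the fonts can live in a directory of your own. A hedged sketch with hypothetical file names:

        \documentclass{article}
        \usepackage{fontspec}
        \setmainfont[
          Path = ./fonts/,
          Extension = .ttf,
          BoldFont = MyFamily-Bold,
          ItalicFont = MyFamily-Italic,
        ]{MyFamily-Regular}
        \begin{document}
        Sample text in the TrueType family.
        \end{document}

    For staying with pdfLaTeX, the fonts (plus the metric and map files a TTF setup generates) can indeed go under a private texmf root registered with MiKTeX, keeping the system tree untouched.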

    Read the article

  • Sourcing local .bashrc .vimrc without copy to remote machine

    - by David Strejc
    Does anyone have an idea or hack for sourcing my local dotfiles on remote machines without copying them over by hand? (I will probably need more of them, so the solution should work with several files.) Is scp'ing .bashrc to the /tmp folder on the remote machine and then pointing an environment variable at it the best solution? I need this because of our company policy and our fast cloud server deployment and redeployment, and I don't want to touch the .bashrc files on the remote machines, so my colleagues can keep their default environment, which doesn't suit me.
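
    A minimal sketch of the /tmp variant mentioned above, with per-user file names to avoid collisions; the host name is hypothetical, and $USER is assumed to be acceptable as a remote file suffix:

        host=cloud01
        scp -q ~/.bashrc "$host:/tmp/.bashrc_$USER"
        scp -q ~/.vimrc  "$host:/tmp/.vimrc_$USER"
        ssh -t "$host" "bash --rcfile /tmp/.bashrc_$USER"
        # and for vim on the remote side:
        #   vim -u /tmp/.vimrc_$USER

    Wrapping this in a small shell function makes it one command per host, and nothing in the remote home directory is ever touched.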

    Read the article

  • Custom Build Step Paths Between x86 and x64 in Visual Studio

    - by Bob Somers
    For reference, I'm using Visual Studio 2010. I have a custom build step defined as follows: if exist "$(TargetDir)"server.dll copy "$(TargetDir)"server.dll "c:\program files (x86)\myapp\server.dll" This works great on my desktop, which is running 64-bit Windows. However, when I build on my laptop, c:\Program Files (x86)\ doesn't exist because it's running 32-bit Windows. I'd like to put in something that will work between both editions of Windows, since the project files are under version control and it's a real pain to change the paths every time I work on my laptop. If this were a *nix environment I'd just create a symlink and be done with it. Any ideas?
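
    A hedged variant of that step which resolves the right Program Files folder on both 32- and 64-bit Windows via the environment variables cmd defines (%ProgramFiles(x86)% exists only on 64-bit systems); MYAPP is a hypothetical variable name:

        if exist "%ProgramFiles(x86)%" (set "MYAPP=%ProgramFiles(x86)%\myapp") else (set "MYAPP=%ProgramFiles%\myapp")
        if exist "$(TargetDir)server.dll" copy "$(TargetDir)server.dll" "%MYAPP%\server.dll"

    Build events run through cmd.exe line by line, so the variable set on the first line is visible on the second.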

    Read the article

  • Rsync --backup-dir seems to be ignored

    - by Patrik
    I want to use rsync to back up a directory from a local location to a remote location, and store changed files in another remote location. I used: rsync -rcvhL --progress --backup [email protected]:/home/user/Changes/`date +%Y.%m.%d` . [email protected]:/home/user/Files/ The backup directory stays empty, while it should be filled. Is what I'm trying to accomplish possible, and am I doing something wrong? Thanks
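
    As written, the command never passes --backup-dir at all; the dated path is being treated as an extra source argument. A hedged correction, assuming the intent is to sync the current directory to Files/ and divert replaced files into a dated sibling directory on the same remote host (rsync allows only one remote endpoint per run, and a relative --backup-dir is resolved against the destination directory):

        rsync -rcvhL --progress --backup \
              --backup-dir=../Changes/$(date +%Y.%m.%d) \
              . [email protected]:/home/user/Files/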

    Read the article

  • Sharing web user controls across projects

    - by Kyle
    I've done this using a regular .cs file that just extends System.Web.UI.UserControl, and then included the assembly of the project that contains the control in other projects. I've also created .ascx files in one project and then copied all .ascx files from a specified folder in the project properties, under Build Events → Pre-build event. Now what I want to do is a combination of those two: I want to be able to use .ascx files that I build in one project in all of my other projects, but I want to include them just using assembly references rather than having to copy them to my "secondary" projects, as that seems a clumsy way to accomplish what I want. It works, yes, but it's not very elegant. Can anyone let me know if this is even possible, and if so, what the best way to approach it is?

    Read the article

  • How to display image that is located on another server on the network (ASP.NET)

    - by eviljack
    I've got an ASP.NET site that's located on a local server (MY_SERVER). One of the things it does is pull up TIFF files which are located on another server (ANOTHER_SERVER); the location of each of these files is stored in SQL. I pull up each of these images and am supposed to display them, but they aren't displaying at all. One complication: the files are not named with a .tiff extension (does that matter?). I am using an Image control to display them, and I'm not sure whether the missing extension is a problem (does the Image control know the difference between a JPEG and a TIFF without the extension?). I am guessing the images aren't displaying because they are not on the same server (MY_SERVER) as the page, but on ANOTHER_SERVER. Any ideas on how to fix this?
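
    A common pattern is to give the Image control a URL on MY_SERVER that streams the remote file through a handler, so the browser never needs direct access to ANOTHER_SERVER. A hedged sketch (the UNC path and query parameter are hypothetical, and the app-pool identity needs read access to the share). The file extension itself doesn't matter if the Content-Type header is right, but note that most browsers won't render TIFF inline regardless, so converting to PNG/JPEG server-side may also be needed:

        using System.Web;

        public class TiffHandler : IHttpHandler
        {
            public void ProcessRequest(HttpContext context)
            {
                // e.g. image.ashx?id=123 -> in real code, look the path up in SQL
                // by id and validate it; never build paths from raw user input.
                string path = @"\\ANOTHER_SERVER\scans\" + context.Request["id"];

                context.Response.ContentType = "image/tiff";
                context.Response.WriteFile(path);
            }

            public bool IsReusable { get { return true; } }
        }

    The page would then use <asp:Image ImageUrl="~/image.ashx?id=123" runat="server" />.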

    Read the article

  • Include Problem with Objective-C++ and OpenGL

    - by Stephen Furlani
    Hello, I feel silly asking this, but I've searched for "include problems" and have only come up with basic stuff. I'm working with an API that includes/imports in its header files (ARGH! HATE ANGER DESTRUCTION). One of these Obj-C headers does #import "OpenGL/CGLMacros.h", which #defines things like glMatrixMode(...). In my code I need the glMatrixMode(...) from #include "OpenGL/gl.h", but I can't get at it! I can't edit the headers of the (poorly coded) API to move the includes into their implementation files. What can I do? If the CGLMacros.h file starts out like

        /* Copyright: (c) 1999 by Apple Computer, Inc., all rights reserved. */
        #ifndef _CGLMACRO_H
        #define _CGLMACRO_H

    can I put a #define _CGLMACRO_H before I include the offending API header file? -Stephen
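
    Yes, that guard trick should work for this particular header, since the #ifndef makes the whole file a no-op once the symbol exists. A sketch, with "VendorAPI.h" as a hypothetical stand-in for the offending header:

        #define _CGLMACRO_H          // pre-claim the include guard shown above
        #import "VendorAPI.h"        // its #import of OpenGL/CGLMacros.h now expands to nothing
        #include <OpenGL/gl.h>       // glMatrixMode(...) resolves to the real function

    Order matters: the #define must come before the first inclusion of the API header anywhere in the translation unit, and the trick has to be repeated (or placed in a prefix header) for every file that includes it.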

    Read the article

  • Restore partitioned database into multiple filegroups

    - by Renju
    Does anyone have a query to restore a partitioned database that has multiple filegroups? In the restore options in SSMS I need to manually edit the "Restore As" path of every filegroup, which is a bit tedious as the database has more than 150 filegroups. For example:

        USE master
        GO
        -- First determine the number and names of the files in the backup.
        RESTORE FILELISTONLY FROM MyNwind_1
        -- Restore the files for MyNwind.
        RESTORE DATABASE MyNwind
          FROM MyNwind_1
          WITH NORECOVERY,
          MOVE 'MyNwind_data_1' TO 'D:\MyData\MyNwind_data_1.mdf',
          MOVE 'MyNwind_data_2' TO 'D:\MyData\MyNwind_data_2.ndf'
        GO
        -- Apply the first transaction log backup.
        RESTORE LOG MyNwind FROM MyNwind_log1 WITH NORECOVERY
        GO
        -- Apply the last transaction log backup.
        RESTORE LOG MyNwind FROM MyNwind_log2 WITH RECOVERY
        GO

    Here I need to specify a separate MOVE clause for each file, which is a tedious task when there are more than 100 filegroups. I need to move the files to a path I provide as a parameter. Please help. Regards, Renju http://blog.renjucool.com
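
    A hedged sketch of generating the MOVE list instead of typing it, by spooling RESTORE FILELISTONLY into a temp table. The column set below matches SQL Server 2008 and differs between versions, and the backup path and target folder are assumptions:

        DECLARE @backup nvarchar(260) = N'D:\Backups\MyNwind.bak';
        DECLARE @target nvarchar(260) = N'D:\MyData\';

        CREATE TABLE #filelist (
            LogicalName nvarchar(128), PhysicalName nvarchar(260), Type char(1),
            FileGroupName nvarchar(128), Size numeric(20,0), MaxSize numeric(20,0),
            FileID bigint, CreateLSN numeric(25,0), DropLSN numeric(25,0),
            UniqueID uniqueidentifier, ReadOnlyLSN numeric(25,0), ReadWriteLSN numeric(25,0),
            BackupSizeInBytes bigint, SourceBlockSize int, FileGroupID int,
            LogGroupGUID uniqueidentifier, DifferentialBaseLSN numeric(25,0),
            DifferentialBaseGUID uniqueidentifier, IsReadOnly bit, IsPresent bit,
            TDEThumbprint varbinary(32));

        INSERT INTO #filelist
        EXEC('RESTORE FILELISTONLY FROM DISK = ''' + @backup + '''');

        -- Build one MOVE clause per file, keyed off the logical name.
        DECLARE @moves nvarchar(max);
        SELECT @moves = ISNULL(@moves + ',', '') + CHAR(10)
            + '  MOVE N''' + LogicalName + ''' TO N''' + @target + LogicalName
            + CASE WHEN Type = 'L' THEN '.ldf'
                   WHEN FileID = 1 THEN '.mdf'
                   ELSE '.ndf' END + ''''
        FROM #filelist;

        -- SELECT rather than PRINT: PRINT truncates long strings.
        SELECT 'RESTORE DATABASE MyNwind FROM DISK = N''' + @backup
             + ''' WITH NORECOVERY,' + @moves;

        DROP TABLE #filelist;

    Review the generated statement before running it; the same idea works as a cursor loop if the string aggregation feels too opaque.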

    Read the article

  • Excel 2010: "Access to the path is denied" on temp file

    - by Chris Anderson
    I am using Excel Data Reader (http://exceldatareader.codeplex.com/) to read data from an Excel file:

        FileStream stream = File.Open(filePath, FileMode.Open, FileAccess.Read);

        // 1. Reading from a binary Excel file ('97-2003 format; *.xls)
        IExcelDataReader excelReader = ExcelReaderFactory.CreateBinaryReader(stream);

        // 2. Reading from an OpenXml Excel file (2007 format; *.xlsx)
        IExcelDataReader excelReader = ExcelReaderFactory.CreateOpenXmlReader(stream);

    This reads both the Excel 97-2003 format and the Excel 2007 format on my local machine, and also when we move it to our test server. However, when moved to production, it works for Excel 97-2003 files, but when I try to read 2007 files I receive the following error:

        Access to the path 'C:\Documents and Settings\PORTALS03\ASPNET\LOCALS~1\Temp\TMP_Z129388041687919815' is denied.

    How is it possible that the 97-2003 Excel files can be read but the 2007 files throw "access is denied"?

    Read the article

  • Django admin interface upload failing on request data read error

    - by Jake
    Hi all, this is an updated version of an old question I asked; I've now done a lot more testing, plus the old question got hijacked. I'm getting a "request data read error" when trying to upload files to the Django admin interface. Files under about 150 KB work, but bigger files always fail, almost always at around 192 KB (that's 3 chunks) completed, sometimes at around 160 KB. The exception I get is:

        File "/usr/lib/python2.4/site-packages/django/http/multipartparser.py", line 405, in read
          return self._file.read(num_bytes)
        IOError: request data read error

    I've tried Chrome and Firefox on Windows and Firefox on Mac, with the same results. I can upload to other sites, so I don't think it's my connection. I'm running Python 2.4, Django 1.1, and mod_wsgi on CentOS (a Media Temple DV server); locally, on the Django development server, it's fine. Everything I've found on this issue says it's a mod_python issue and that changing to mod_wsgi will fix it, but I am already running mod_wsgi. Can anyone help?

    Read the article

  • File not readable exception - pear/Config_Lite

    - by CasperNine
    I have two config files, located at /etc/svnauth and /var/www/svnauth. I have given read and write access for both files as shown below:

        chown -R apache:apache /etc/svnauth
        chmod -R 770 /etc/svnauth
        chown -R apache:apache /var/www/svnauth
        chmod -R 770 /var/www/svnauth

    When I try to read these two files using pear/Config_Lite, /etc/svnauth always fails, while I can successfully read the /var/www/svnauth file. Any reasons for this? What am I missing here? Following is the error message I get:

        Fatal error: Uncaught exception 'Config_Lite_Exception_Runtime' with message
        'file not readable: /etc/svnauth' in /var/www/html/svnmanager/Config/Lite.php:112
        Stack trace:
        #0 /var/www/html/svnmanager/index.php(60): Config_Lite->read('/etc/svnauth')
        #1 {main}
          thrown in /var/www/html/svnmanager/Config/Lite.php on line 112
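
    Two quick checks worth running, as a sketch: the first separates filesystem permissions from anything Config_Lite itself does, and the second matters on SELinux-enabled systems, where /etc and /var/www carry different default security contexts that can block httpd even when the mode bits allow access:

        sudo -u apache cat /etc/svnauth
        ls -Z /etc/svnauth /var/www/svnauth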

    Read the article

  • Examples of networked Flash games

    - by videodnd
    Maybe I am asking the wrong questions, because I don't see any sample projects out there. I know Flash developers have done kiosks and renovated arcade games; "come on, we see Flash everywhere." Is there a sample project I could be pointed towards? It would be a lifesaver. Can I prepare my SWF files like an image gallery and receive XML commands to load them? Where do I start? Flash/After Effects skills have got me through so far, but I need help! It would be fun if it weren't so stressful.

    Criteria:
    - TCP/IP socket connection
    - Flash package
    - XML commands
    - load a SWF file into a container

    Additional questions (a sketch follows below):
    - How do I prepare my Flash files and XML sheet to receive commands? Is there any sample out there?
    - What about e.data, urlLoad, the XMLSocket class, or a TCP/IP XML socket connection for loading?
    - Is a binary or XML method better for loading and reloading SWF files?
    - Do I need Red5 or a media server?

    videoDnd, ambitious development noob
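
    A rough AS3 sketch of those criteria, assuming Flash Player's XMLSocket for the TCP/IP link (it exchanges null-terminated XML messages, so no Red5 or media server is needed just for commands) and Loader for swapping SWFs in and out of a container; the host, port, and command format are assumptions:

        package {
            import flash.display.Sprite;
            import flash.display.Loader;
            import flash.events.DataEvent;
            import flash.net.XMLSocket;
            import flash.net.URLRequest;

            public class Kiosk extends Sprite {
                private var socket:XMLSocket = new XMLSocket();
                private var container:Loader = new Loader();

                public function Kiosk() {
                    addChild(container);
                    socket.addEventListener(DataEvent.DATA, onCommand);
                    socket.connect("controller.local", 8080);
                }

                private function onCommand(e:DataEvent):void {
                    var cmd:XML = new XML(e.data);   // e.g. <load swf="intro.swf"/>
                    if (cmd.localName() == "load")
                        container.load(new URLRequest(String(cmd.@swf)));
                }
            }
        }

    Note that the server side must terminate each XML message with a zero byte, and the player will request a socket policy file before allowing the connection.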

    Read the article

  • Ant delete task

    - by user315228
    Hi, I have several files with names starting with abc, and I want to delete all of them. Is this possible using an Ant task? For example, my directory structure is:

        c:\
          myapp\
            abc.xml
            abc.txt
            abc-1.2.xml
            abc-abc.xml
            abcdef.xml
            pqr.xml
            xyz.xml
            abc\

    From this, I need to delete all abc* files, so Ant should delete:

        abc.xml
        abc.txt
        abc-1.2.xml
        abc-abc.xml
        abcdef.xml

    but it should leave the abc directory alone. Can somebody help me? Almas
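
    A minimal sketch, assuming the layout above; by default Ant's <delete> with a nested fileset removes matching files only (includeEmptyDirs defaults to false), so the abc directory survives even though it matches the pattern:

        <delete>
            <fileset dir="c:/myapp" includes="abc*"/>
        </delete>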

    Read the article

  • Problems with dotLess: stopping characters and hacks list?

    - by rDeeb
    Has anyone run into trouble running dotLess with hacks in your CSS files? We've been working on a project and, after one year of development, just installed dotLess to ease the job of creating new CSS files for some new functionality of the web site, and now our old CSS is not working correctly. Viewing the resulting CSS files, we realized that the dotLess compiler stopped at some hacks like this one:

        html>/**/body #itemTable .informationView fieldset textarea {
            min-height: 1.3em;
            height: 1.3em;
        }

    So we were wondering: is there any list of stopping words or hacks for dotLess?

    Read the article

  • Difference and correct usage for /tmp and /var/tmp

    - by David
    I haven't put much thought into this until now, but it seems odd that there are both /var/tmp and /tmp directories in most of the Linux distros I routinely use (Ubuntu, CentOS, Red Hat). Is there any semantic difference between the two? Like, when whoever designed the first filesystem layout did so, did he or she think "Not all tmp files are created equal!"? The only difference I've found, on CentOS, is that /tmp is routinely scrubbed of files older than 240 hours, while /var/tmp holds onto stale files for 720 hours.

    Read the article

  • Codeigniter Routes for filename with extension

    - by thehuby
    I am using CodeIgniter and its routes system successfully, with some lovely regexps; however, I have come unstuck on what should be an easy thing. I want to serve a bunch of search-engine-related files (for Google Webmaster Tools etc.) plus the robots.txt file, all from a controller. So, I have created the controller and updated the routes file, but I can't get it working for these files. Here's a snippet from my routes file:

        $route['robots\.txt|LiveSearchSiteAuth\.xml'] = 'search_controller/files';

    Within the function I use the URI helper to figure out which content to show. I can't get this to match, which points to my regexp being wrong. I'm sure this is a really obvious one, but it's late and my caffeine tank is empty :)
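
    A hedged guess at the fix: CodeIgniter anchors the route pattern with ^ and $ before matching, so a bare | lets each anchor bind to only one alternative; grouping the alternation keeps both file names fully anchored:

        $route['(robots\.txt|LiveSearchSiteAuth\.xml)'] = 'search_controller/files';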

    Read the article

  • Linux filesystem with inodes close on the disk

    - by pts
    I'd like to make ls -laR /media/myfs on Linux as fast as possible. I'll have 1 million files on the filesystem, 2 TB of total file size, and some directories containing as many as 10000 files. Which filesystem should I use, and how should I configure it? As far as I understand, the reason ls -laR is slow is that it has to stat(2) each inode (i.e. 1 million stat(2)s), and since inodes are distributed randomly on the disk, each stat(2) needs one disk seek. Here are some solutions I had in mind, none of which I am satisfied with:

    1. Create the filesystem on an SSD, because seek operations on SSDs are fast. This wouldn't work, because a 2 TB SSD doesn't exist, or it's prohibitively expensive.
    2. Create a filesystem which spans two block devices, an SSD and a disk: the disk contains file data, and the SSD contains all the metadata (including directory entries, inodes and POSIX extended attributes). Is there a filesystem which supports this? Would it survive a system crash (power outage)?
    3. Use find /media/myfs on ext2, ext3 or ext4 instead of ls -laR /media/myfs, because the former can take advantage of the d_type field (see the getdents(2) man page), so it doesn't have to stat. Unfortunately, this doesn't meet my requirements, because I need all file sizes as well, which find /media/myfs doesn't print.
    4. Use a filesystem, such as VFAT, which stores inodes in the directory entries. I'd love this one, but VFAT is not reliable and flexible enough for me, and I don't know of any other filesystem which does that. Do you? Of course, storing inodes in the directory entries wouldn't work for files with a link count greater than 1, but that's not a problem, since I have only a few dozen such files in my use case.
    5. Adjust some settings in /proc or sysctl so that inodes are locked into system memory forever. This would not speed up the first ls -laR /media/myfs, but it would make all subsequent invocations amazingly fast. How can I do this? I don't like this idea, because it doesn't speed up the first invocation, which currently takes 30 minutes. Also, I'd like to lock the POSIX extended attributes in memory as well. What do I have to do for that?
    6. Use a filesystem which has an online defragmentation tool that can be instructed to relocate inodes to the beginning of the block device. Once the relocation is done, I can run dd if=/dev/sdb of=/dev/null bs=1M count=256 to get the beginning of the block device fetched into the kernel's in-memory cache without seeking, and then the stat(2) operations would be fast, because they read from the cache. Is there a way to lock those inodes and/or blocks into memory once they have been read? Which filesystem has such a defragmentation tool?

    Read the article
