Search Results

Search found 17233 results on 690 pages for 'download speed'.


  • Can router configuration cause a drop in download rate?

    - by Behrooz
    My download speed has gone crazy since I changed the router's IP address, and nothing was fixed by doing a factory reset. The speed used to be 1024 kb/s (128 kB/s), but it is 200 kb/s (max) right now. I mean, it works fine if a request is small (i.e. an HTTP request), but it gets slow if a request has a big response. Please help (I have been downloading VS2010 for three days): http://serverfault.com/questions/135243/ No one on Server Fault helped me after I posted my question there, so please migrate it to Server Fault. Thanks.

    Read the article

  • How to know which RAM speed I can use

    - by Phuong Nguyen
    I have a Dell Vostro 1000, which uses an ATI RS480 mainboard. The specification says that the front-side bus of this mainboard runs at 1 GHz. However, the default RAM that came with the mainboard is PC4200 (533 MHz). I wonder if I can replace them with PC6400 (800 MHz), or at least PC5300 (667 MHz)?

    Read the article

  • Improve speed of "start menu" in Linux Mint 10 - Ubuntu 10.10 derivative

    - by Gabriel L. Oliveira
    I have a global menu (including the Applications, Administration and System tabs) that takes too long (for me) to load: about 2.5 seconds. Of course, this time is only taken on the first start; after it has loaded, subsequent opens take less than 0.2 milliseconds. The menu used to take even longer (about 5 seconds), and I found that this was because of the 'Other' part of the menu, which included many applications installed with Wine, so I removed all of them (I didn't need them at all). I have a "normal" knowledge of programming, and I think the process of starting the menu for the first time involves some kind of cache function that tries to find which installed apps need to be placed in the menu. But I haven't found this function, so I can't analyze in detail what it is doing (whether it searches for files under "~/.local/share/applications" or anywhere else). Also, I found that hitting Alt-F2 also fires this cache function, because after waiting for that to load, opening the menu took less than 0.2 milliseconds. So, could anyone help me reduce this time? I read online that some users could reduce the time by resizing the application icons, but most of my icons are already 25x25. Any other ideas? Maybe loading it in a separate process, or including it in startup... I don't know. PS: Sorry if this is an awkward question, but I just don't like waiting for things to happen, and I think this process should be smoother than it is now. Also, thanks in advance!
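
    A quick way to gauge what that first-open scan has to chew through is to time reading every .desktop entry yourself. Here is a small Python sketch, under the assumption that the menu parses the standard XDG application directories (the same ~/.local/share/applications path mentioned above, plus the system-wide one):

        import glob
        import os
        import time

        # standard XDG application-entry locations; adjust if your setup differs
        dirs = ["/usr/share/applications",
                os.path.expanduser("~/.local/share/applications")]

        start = time.time()
        entries = []
        for d in dirs:
            entries.extend(glob.glob(os.path.join(d, "*.desktop")))

        # read each entry fully, roughly what any menu "cache function" must do
        for path in entries:
            with open(path, encoding="utf-8", errors="ignore") as f:
                f.read()

        print("%d menu entries read in %.3f s" % (len(entries), time.time() - start))

    If the script finishes far faster than the menu's 2.5 seconds, most of the time is likely going into icon lookups and rendering rather than raw file parsing.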

    Read the article

  • How to check DVD integrity at max read speed of DVD writer

    - by ashishsony
    I need to check the integrity of burned DVDs so that I can be sure about my backed-up data. I use DL DVDs to take the backup. Earlier I used the VSO Inspector software for this, but since the day I switched to DL DVDs, VSO Inspector gives me errors when checking. I think the errors occur because the switch in layer writing involves some dummy data somewhere. Secondly, it's damned slow at checking. I believe a utility that reads all the files (not the disk surface) and reports whether any files are unreadable would do the job, but it should be quick! Nobody wants to sit through a 3-4 hour disk check after a quick 30-minute data burn! I am looking for such a utility on Windows or Linux; even scripts (Python, etc.) will do. I just want to be assured that the data is safe. Can someone help me with this? Thanks.
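
    A minimal Python sketch of the kind of utility asked for here: it walks every file on the mounted disc, reads each one back in large sequential chunks, and reports any file the drive cannot read. The mount point is an assumption; adjust it for your system (e.g. "D:\\" on Windows).

        import os
        import sys

        MOUNT_POINT = "/media/dvd"   # hypothetical mount point; e.g. "D:\\" on Windows
        CHUNK = 1024 * 1024          # 1 MB reads keep the drive streaming quickly

        bad = []
        for root, dirs, files in os.walk(MOUNT_POINT):
            for name in files:
                path = os.path.join(root, name)
                try:
                    with open(path, "rb") as f:
                        while f.read(CHUNK):   # read to EOF; a bad sector raises
                            pass
                except OSError as err:
                    bad.append(path)
                    print("UNREADABLE:", path, "-", err, file=sys.stderr)

        print("Done. %d unreadable file(s)." % len(bad))

    Because it only reads the files (not the whole disc surface), it runs at roughly the drive's sequential read speed, which is the quick check the asker wants.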

    Read the article

  • Server speed: sharing one script.php or using many copies of the same script.php

    - by Marco Demaio
    Let's assume I have thousands of domains on the same Apache server. Each domain is in a folder under the server's public_html document folder, so it can be accessed by calling "www.somedomain.com" or by calling "www.serverdomain.com/somedomain_folder". In each domain there is a website that needs a certain script.php (identical for each domain). From a coding point of view, it's obvious that it's better to use a single script.php: when I update it with new features/bug fixes etc., I only need to update one file on the server, and it will work for all domains. But from a server point of view? If I use a single script, all domains will access it at the same time; will the server run slower compared to the situation where each domain calls its own copy of the script?
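
    One common way to get the single-copy benefit is to keep one master script and symlink it into each domain folder. A small Python sketch of that layout (the paths are hypothetical):

        import os

        MASTER = "/home/user/shared/script.php"   # the one file you update
        PUBLIC_HTML = "/home/user/public_html"    # contains one folder per domain

        for folder in os.listdir(PUBLIC_HTML):
            domain_dir = os.path.join(PUBLIC_HTML, folder)
            if not os.path.isdir(domain_dir):
                continue
            link = os.path.join(domain_dir, "script.php")
            if not os.path.lexists(link):         # don't clobber existing files
                os.symlink(MASTER, link)
                print("linked", link)

    To Apache the symlink behaves like a local copy of the file (given Options +FollowSymLinks), so there is no extra contention beyond what serving one popular file already involves.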

    Read the article

  • Why is Dropbox syncing so freakishly slowly in my Linux virtual machine?

    - by Bec
    I am setting up a Linux virtual machine (Windows 7 64-bit host, Ubuntu 64-bit guest, using VirtualBox), and I just installed Dropbox and set it to sync. I've only got about 2 GB in there, so I figured it should take just an afternoon, but it's going at about 0.5 kB/second and says it will take about 60 days. I usually get about 200 kB/second in the host OS, and downloading straight from the Dropbox website through Firefox in the Ubuntu VM I get about that too, but sync is really slow. Any tips?

    Read the article

  • Hard drive speed drops a lot!

    - by AZ
    The hard drive is used for BitTorrent with uTorrent. Recently uTorrent began to report an "I/O device error", so I tested the drive with HD Tune, and it turns out the transfer rate is only 16 MB/s. At the same time, I tested a similar hard drive, which rates at 120 MB/s. They are both 7200 rpm desktop hard drives. I ran chkdsk to fix it, but the rate didn't change. Is this a symptom of hard drive failure? Should I back up the content on the disk ASAP, or is there any other tool that can fix or diagnose it?

    Read the article

  • Download/upload is too slow on CentOS

    - by Mehdi
    My download/upload speed, both within the server and out of it, is too slow (around 50 KB/s!). Did I miss some configuration? Some information:

        CentOS release 6.3
        uptime: load average: 0.17, 0.32, 0.37

    free -m:

                         total    used    free  shared  buffers  cached
        Mem:             24009   21988    2021       0      806   18098
        -/+ buffers/cache:        3083   20926
        Swap:             4095      28    4067

    lshw -C network:

        *-network
            description: Ethernet interface
            product: 82574L Gigabit Network Connection
            vendor: Intel Corporation
            physical id: 0
            bus info: pci@0000:02:00.0
            logical name: eth0
            version: 00
            serial: 00:25:90:70:17:4a
            size: 100MB/s
            capacity: 1GB/s
            width: 32 bits
            clock: 33MHz
            capabilities: pm msi pciexpress msix bus_master cap_list ethernet physical tp 10bt 10bt-fd 100bt 100bt-fd 1000bt-fd autonegotiation
            configuration: autonegotiation=off broadcast=yes driver=e1000e driverversion=1.9.5-k duplex=full firmware=2.1-2 ip=108.175.8.123 latency=0 link=yes multicast=yes port=twisted pair speed=100MB/s
            resources: irq:16 memory:fb900000-fb91ffff ioport:e000(size=32) memory:fb920000-fb923fff

    ethtool eth0:

        Settings for eth0:
            Supported ports: [ TP ]
            Supported link modes: 10baseT/Half 10baseT/Full 100baseT/Half 100baseT/Full 1000baseT/Full
            Supports auto-negotiation: Yes
            Advertised link modes: Not reported
            Advertised pause frame use: No
            Advertised auto-negotiation: No
            Speed: 100Mb/s
            Duplex: Full
            Port: Twisted Pair
            PHYAD: 1
            Transceiver: internal
            Auto-negotiation: off
            MDI-X: off
            Supports Wake-on: pumbg
            Wake-on: g
            Current message level: 0x00000001 (1)
            Link detected: yes

    dmesg | grep e1000e:

        e1000e: Intel(R) PRO/1000 Network Driver - 1.9.5-k
        e1000e: Copyright(c) 1999 - 2012 Intel Corporation.
        e1000e 0000:02:00.0: Disabling ASPM L0s
        e1000e 0000:02:00.0: PCI INT A -> GSI 16 (level, low) -> IRQ 16
        e1000e 0000:02:00.0: setting latency timer to 64
        e1000e 0000:02:00.0: irq 33 for MSI/MSI-X
        e1000e 0000:02:00.0: irq 34 for MSI/MSI-X
        e1000e 0000:02:00.0: irq 35 for MSI/MSI-X
        e1000e 0000:02:00.0: eth0: (PCI Express:2.5GT/s:Width x1) 00:25:90:70:17:4a
        e1000e 0000:02:00.0: eth0: Intel(R) PRO/1000 Network Connection
        e1000e 0000:02:00.0: eth0: MAC: 3, PHY: 8, PBA No: FFFFFF-0FF
        e1000e: eth0 NIC Link is Up 100 Mbps Full Duplex, Flow Control: None
        e1000e 0000:02:00.0: eth0: 10/100 speed: disabling TSO
        (the two lines above repeat several more times)
        e1000e 0000:02:00.0: eth0: Unsupported Speed/Duplex configuration
        e1000e: eth0 NIC Link is Up 10 Mbps Full Duplex, Flow Control: None
        e1000e 0000:02:00.0: Disabling ASPM L1
        e1000e 0000:02:00.0: eth0: changing MTU from 1500 to 9000
        e1000e: eth0 NIC Link is Up 100 Mbps Full Duplex, Flow Control: None
        e1000e 0000:02:00.0: eth0: 10/100 speed: disabling TSO

    Read the article

  • Download YouTube Videos the Easy Way

    - by Trevor Bekolay
    You can't be online all the time, and despite the majority of YouTube videos being nut-shots and Lady Gaga parodies, there is a lot of great content that you might want to download and watch offline. There are programs and browser extensions to do this, but we've found that the easiest and quickest method is a bookmarklet that was originally posted on the Google Operating System blog (it has since been removed). It will let you download standard-quality and high-definition movies as MP4 files. Also, because it's a bookmarklet, it will work in any modern web browser, on any operating system! Installing the bookmarklet is easy: just drag and drop the Get YouTube video link below to the bookmarks bar of your browser of choice. If you've hidden the bookmarks bar, in most browsers you can right-click on the link and save it to your bookmarks.

        Get YouTube video

    With the bookmarklet available in your browser, go to the YouTube video that you'd like to download and click on the Get YouTube video link in your bookmarks bar (or in the bookmarks menu, wherever you saved it earlier). You will notice some new links appear below the description of the video. If you download the standard-definition file, it will save as "video.mp4" by default; if you download the high-definition file, it will save with the same name as the title of the video. There are many methods of downloading YouTube videos, but we think this is the easiest and quickest: you don't have to install anything or use up resources, and you still get a one-click link to download an MP4. Do you use a different method to download YouTube videos? Let us know about it in the comments! The bookmarklet code:

        javascript:(function(){if(document.getElementById('download-youtube-video'))return;var args=null,video_title=null,video_id=null,video_hash=null;var download_code=new Array();var fmt_labels={'18':'standard MP4','22':'HD 720p','37':'HD 1080p'};try{args=yt.getConfig('SWF_ARGS');video_title=yt.getConfig('VIDEO_TITLE')}catch(e){}if(args){var fmt_url_map=unescape(args['fmt_url_map']);if(fmt_url_map=='')return;video_id=args['video_id'];video_hash=args['t'];video_title=video_title.replace(/["\'\?\\\/\:\*<>]/g,'');var fmt=new Array();var formats=fmt_url_map.split(',');var format;for(var i=0;i<formats.length;i++){var format_elems=formats[i].split('|');fmt[format_elems[0]]=unescape(format_elems[1])}for(format in fmt_labels){if(fmt[format]!=null){download_code.push('<a href=\''+(fmt[format]+'&title='+video_title)+'\'>'+fmt_labels[format]+'</a>')}else if(format=='18'){download_code.push('<a href=\'http://www.youtube.com/get_video?fmt=18&video_id='+video_id+'&t='+video_hash+'\'>'+fmt_labels[format]+'</a>')}}}if(video_id==null||video_hash==null)return;var div_embed=document.getElementById('watch-embed-div');if(div_embed){var div_download=document.createElement('div');div_download.innerHTML='<br /><span id=\'download-youtube-video\'>Download: '+download_code.join(' | ')+'</span>';div_embed.appendChild(div_download)}})()

    Read the article

  • Is it possible to download extremely large files intelligently or in parts via SSH from Linux to Windows?

    - by Andrew
    I have a ~35 GB file on a remote Linux Ubuntu server. Locally, I am running Windows XP, so I am connecting to the remote Linux server using SSH (specifically, I am using a Windows program called SSH Secure Shell Client version 3.3.2). Although my broadband internet connection is quite good, my download of the large file often fails with a Connection Lost error message. I am not sure, but I think that it fails because perhaps my internet connection goes out for a second or two every several hours. Since the file is so large, downloading it may take 4.5 to 5 hours, and perhaps the internet connection goes out for a second or two during that long time. I think this because I have successfully downloaded files of this size using the same internet connection and the same SSH software on the same computer. In other words, sometimes I get lucky and the download finishes before the internet connection drops for a second. Is there any way that I can download the file in an intelligent way -- whereby the operating system or software "knows" where it left off and can resume from the last point if a break in the internet connection occurs? Perhaps it is possible to download the file in sections? Although I do not know if I can conveniently split my file into multiple files -- I think this would be very difficult, since the file is binary and is not human-readable. As it is now, if the entire ~35 GB file download doesn't finish before the break in the connection, then I have to start the download over and overwrite the ~5-20 GB chunk that was downloaded locally so far. Do you have any advice? Thanks.
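
    The standard answer is a transfer tool that can resume: rsync --partial (over SSH) keeps the partial file and continues it on the next run. As an illustration of the underlying mechanism, here is a Python sketch using the paramiko SSH library; the host, credentials and paths are made up. It appends to the partial local file starting from the matching remote offset, so re-running it after a dropped connection continues rather than restarts:

        import os
        import paramiko

        HOST, USER, PASSWORD = "remote.example.com", "andrew", "secret"  # assumptions
        REMOTE_PATH = "/data/bigfile.bin"                                # assumption
        LOCAL_PATH = "bigfile.bin"
        CHUNK = 1024 * 1024

        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect(HOST, username=USER, password=PASSWORD)
        sftp = client.open_sftp()

        # resume point = how much of the file we already have locally
        offset = os.path.getsize(LOCAL_PATH) if os.path.exists(LOCAL_PATH) else 0
        remote_size = sftp.stat(REMOTE_PATH).st_size

        with sftp.open(REMOTE_PATH, "rb") as remote, open(LOCAL_PATH, "ab") as local:
            remote.seek(offset)
            while offset < remote_size:
                data = remote.read(CHUNK)
                if not data:
                    break
                local.write(data)
                offset += len(data)

        sftp.close()
        client.close()
        print("Have %d of %d bytes" % (offset, remote_size))

    There is no need to split the binary file into pieces; seeking to an offset gives you the same effect without touching the file.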

    Read the article

  • Tom Cruise: Meet Fusion Apps UX and Feel the Speed

    - by ultan o'broin
    Unfortunately, I am old enough to remember, and now to admit that I really loved, the movie Top Gun. You know the one: Tom Cruise, US Navy F-14 ace pilot, Mr Maverick, crisis of confidence, meets woman, etc., etc. Anyway, one of the more memorable lines (there were a few) was: "I feel the need, the need for speed." I was reminded of Tom Cruise recently. Paraphrasing a certain Senior Vice President talking about Oracle Fusion Applications and user experience at an all-hands meeting, I heard that: Applications can never be too easy to use. Performance can never be too fast. Developers, assume that your code is always "on". Perfect. You cannot overstate the user experience importance of application speed to users, or at least their perception of speed. We all want that super speed of execution and performance, and increasingly so as enterprise users bring the expectations of consumer IT into the work environment. Sten Vesterli (@stenvesterli), an Oracle Fusion Applications User Experience Advocate, also addressed the speed point artfully at an Oracle Usability Advisory Board meeting in Geneva. Sten asked us, the next time we Googled something, to think about the message telling us that Google has found hundreds of thousands or millions of results in a split second (for example, About 8,340,000 results (0.23 seconds)). Now, how many results can we see, and how many can we use immediately? Yet this simple message communicating the total results available to us works a special magic about speed, delight, and excitement that Google has made its own in the search space. And, guess what? The Oracle Application Development Framework table component relies on a similar "virtual performance boost", says Sten, when it displays the first 50 records in a table and uses a scrollbar indicating the total size of the data record set. The user scrolls, and the application automatically retrieves more records as needed. Application speed and its perception by users is worth bearing in mind the next time you're at a customer site and the IT department demands that you retrieve every record from the database. Just think of... Dave Ensor: "I'll give you all the rows you ask for in one second. If you promise to use them." (Again, hat tip to Sten.) And then maybe think of... Tom Cruise. And if you want to read about the speed of Oracle Fusion Applications, and what that really means in terms of user productivity for your entire business, then check out the Oracle Applications User Experience Oracle Fusion Applications white papers on the usable apps website.
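
    The pattern Sten describes is easy to see in miniature: report the total immediately, materialize only the first page, and fetch further pages on demand. A toy Python sketch (not Oracle ADF code) of that virtual-scrolling idea:

        def lazy_table(fetch_page, page_size=50):
            """fetch_page(offset, limit) -> list of rows; called only when needed."""
            cache = {}
            def get_row(i):
                page = i // page_size
                if page not in cache:             # fetch a page only on first touch
                    cache[page] = fetch_page(page * page_size, page_size)
                return cache[page][i % page_size]
            return get_row

        # demo against a fake million-row data source
        rows = ["row %d" % i for i in range(1_000_000)]
        get = lazy_table(lambda off, n: rows[off:off + n])
        print("About 1,000,000 results")    # the count is cheap and shown instantly
        print(get(0), get(49))              # first page fetched once
        print(get(73))                      # next page fetched on demand

    The user perceives a table of a million rows, but the application only ever pays for the pages actually scrolled into view.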

    Read the article

  • ASP.NET MVC FileContentResult IE 8.0 hangs on download

    - by marc.d
    Some of my users are experiencing problems when they try to download a report; the download just hangs at 0%, and restarting IE usually fixes the problem. Why does this happen? I am using ASP.NET MVC (v1), and my action looks like this:

        <Authorize()> _
        <AcceptVerbs(HttpVerbs.Get)> _
        Function RenderReport(ByVal guid As Guid, ByVal anonym As Boolean) As FileContentResult
            ...
            Dim mimeType As String = String.Empty
            Dim renderedBytes() As Byte = EmployeePresentation.Render(guid, mimeType, Server.MapPath("~/Reports/..."), anonym)
            Return File(renderedBytes, mimeType, filename)
        End Function

    The filename is US-ASCII encoded, the file size is usually around 300 KB, and the mimeType is application/pdf. Thanks in advance.

    Read the article

  • ASP.NET Grid AjaxPanel Download Issue

    - by Mahesh
    Hi, I have a Telerik grid which performs operations like searching, sorting, filtering, etc. To make customers happy, we put this control in an Ajax panel for a seamless experience. Now we have added new functionality to the grid, where the customer can download the entire row information as a CSV file. Because the response is a file, the Ajax panel tries to parse the output and throws the following exception:

        Microsoft JScript runtime error: Sys.WebForms.PageRequestManagerParserErrorException: The message received from the server could not be parsed. Common causes for this error are when the response is modified by calls to Response.Write(), response filters, HttpModules, or server trace is enabled. Details: Error parsing near '?'.

    Could you please help me in having both functionalities (Ajax and download) in place without any error? Thanks, Mahesh

    Read the article

  • Make an href image link download on click

    - by Piero
    Hey all, I generate normal links like <a href="/path/to/image"><img src="/path/to/image" /></a> in a web app. When I click on the link, it displays the picture on a new page; if you want to save the picture, you need to right-click on it and select "Save as". I don't want this behaviour: I would like a download box to pop up when I click on the link. Is that possible with just HTML or JavaScript? How? If not, I guess I would have to write a download.php script and call it in the href with the file name as a parameter...? Greetings! :)
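
    The save dialog is triggered by the Content-Disposition: attachment response header, so a small server-side passthrough like the download.php idea above is the usual fix. As a sketch of that mechanism, here it is in Python with Flask (purely illustrative; the route and directory are assumptions):

        from flask import Flask, send_file

        app = Flask(__name__)
        IMAGE_DIR = "/path/to/images"   # hypothetical; validate 'name' in real use

        @app.route("/download/<name>")
        def download(name):
            # as_attachment=True sets "Content-Disposition: attachment",
            # which makes the browser pop up a save dialog instead of rendering
            return send_file(IMAGE_DIR + "/" + name, as_attachment=True)

        if __name__ == "__main__":
            app.run()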

    Read the article

  • Download a file from one ASP.NET web application to another (given the credentials)

    - by Tom S.
    Hi everybody! I'm working on an ASP.NET 3.5 web application (C#) which has a file with some information that is updated frequently, and only a few accounts can access it (the application uses the ASP.NET authentication system, stored in a SQL database). My task is to parse that file, so I made a small parser (another web app) to show the information in a friendlier way. However, every time I want to parse it, I need to log in to the application with one of those accounts, download the file, and put it in the parser's folder. Is there any way, given the username and password, to download the file directly from the parser application and use that? Thanks in advance.
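
    One option is to script exactly the manual flow: log in once, keep the session cookie, and fetch the file with it. A minimal sketch of that idea in Python with the requests library (the URLs and form field names are assumptions about the target app; an ASP.NET login form may also require hidden fields such as __VIEWSTATE to be posted back):

        import requests

        LOGIN_URL = "https://mainapp.example.com/Login.aspx"     # hypothetical
        FILE_URL = "https://mainapp.example.com/data/info.txt"   # hypothetical

        session = requests.Session()      # carries the auth cookie between calls
        session.post(LOGIN_URL, data={"username": "parser", "password": "secret"})

        resp = session.get(FILE_URL)
        resp.raise_for_status()           # fail loudly if auth didn't stick

        with open("info.txt", "wb") as f:
            f.write(resp.content)

    The parser app can run this on a schedule instead of a person logging in and copying the file by hand.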

    Read the article

  • Large File Download - Connection With Server Reset

    - by daveywc
    I have an ASP.NET website that allows the user to download largish files, 30 MB to about 60 MB. Sometimes the download works fine, but often it fails at some varying point before the download finishes, with a message saying that the connection with the server was reset. Originally I was simply using Server.TransmitFile, but after reading up a bit I am now using the code posted below. I am also setting the Server.ScriptTimeout value to 3600 in the Page_Init event.

        private void DownloadFile(string fname, bool forceDownload)
        {
            string path = MapPath(fname);
            string name = Path.GetFileName(path);
            string ext = Path.GetExtension(path);
            string type = "";

            // set known types based on file extension
            if (ext != null)
            {
                switch (ext.ToLower())
                {
                    case ".mp3":
                        type = "audio/mpeg";
                        break;
                    case ".htm":
                    case ".html":
                        type = "text/HTML";
                        break;
                    case ".txt":
                        type = "text/plain";
                        break;
                    case ".doc":
                    case ".rtf":
                        type = "Application/msword";
                        break;
                }
            }

            if (forceDownload)
            {
                Response.AppendHeader("content-disposition", "attachment; filename=" + name.Replace(" ", "_"));
            }

            if (type != "")
            {
                Response.ContentType = type;
            }
            else
            {
                Response.ContentType = "application/x-msdownload";
            }

            System.IO.Stream iStream = null;

            // Buffer to read 10K bytes in chunk:
            byte[] buffer = new Byte[10000];

            // Length of the file:
            int length;

            // Total bytes to read:
            long dataToRead;

            try
            {
                // Open the file.
                iStream = new System.IO.FileStream(path, System.IO.FileMode.Open,
                    System.IO.FileAccess.Read, System.IO.FileShare.Read);

                // Total bytes to read:
                dataToRead = iStream.Length;

                //Response.ContentType = "application/octet-stream";
                //Response.AddHeader("Content-Disposition", "attachment; filename=" + filename);

                // Read the bytes.
                while (dataToRead > 0)
                {
                    // Verify that the client is connected.
                    if (Response.IsClientConnected)
                    {
                        // Read the data in buffer.
                        length = iStream.Read(buffer, 0, 10000);

                        // Write the data to the current output stream.
                        Response.OutputStream.Write(buffer, 0, length);

                        // Flush the data to the HTML output.
                        Response.Flush();

                        buffer = new Byte[10000];
                        dataToRead = dataToRead - length;
                    }
                    else
                    {
                        // prevent infinite loop if user disconnects
                        dataToRead = -1;
                    }
                }
            }
            catch (Exception ex)
            {
                // Trap the error, if any.
                Response.Write("Error : " + ex.Message);
            }
            finally
            {
                if (iStream != null)
                {
                    // Close the file.
                    iStream.Close();
                }
                Response.Close();
            }
        }

    Read the article

  • PHP file download slows down

    - by hobbywebsite
    OK, first off, thanks for your time; I wish I could give more than one point for this question. Problem: I have some music files on my site (.mp3), and I am using a PHP file to increment a download counter in a database and to point to the file to download. For some reason this method starts at 350 kB/s, then slowly drops to 5 kB/s, at which point the file says it will take 11 hours to complete. BUT if I go directly to the .mp3 file, my browser brings up a player, and then I can right-click and "save as", which works fine: the download completes in 3 minutes. (Yes, both during the same time period, for those thinking it's my connection or ISP; and it's not my server either.) The only things I've been playing around with recently are the php.ini and .htaccess files. So without further ado, the PHP file, php.ini, and .htaccess:

    download.php

        <?php
        include("config.php");
        include("opendb.php");

        $filename = 'song_name';
        $filedl = $filename . '.mp3';

        $query = "UPDATE songs SET song_download=song_download+1 WHERE song_linkname='$filename'";
        mysql_query($query);

        header('Content-Disposition: attachment; filename=' . basename($filedl));
        header('Content-type: audio/mp3');
        header('Content-Length: ' . filesize($filedl));
        readfile('/music/' . $filename . '/' . $filedl);

        include("closedb.php");
        ?>

    php.ini

        register_globals = off
        allow_url_fopen = off
        expose_php = Off
        max_input_time = 60
        variables_order = "EGPCS"
        extension_dir = ./
        upload_tmp_dir = /tmp
        precision = 12
        SMTP = relay-hosting.secureserver.net
        url_rewriter.tags = "a=href,area=href,frame=src,input=src,form=,fieldset="
        ; Defines the default timezone used by the date functions
        date.timezone = "America/Los_Angeles"

    .htaccess

        Options +FollowSymLinks
        RewriteEngine on
        RewriteCond %{HTTP_HOST} !^(www.MindCollar.com)?$ [NC]
        RewriteRule (.*) http://www.MindCollar.com/$1 [R=301,L]
        <IfModule mod_rewrite.c>
        RewriteEngine On
        ErrorDocument 404 /errors/404.php
        ErrorDocument 403 /errors/403.php
        ErrorDocument 500 /errors/500.php
        </IfModule>
        Options -Indexes
        Options +FollowSymlinks
        <Files .htaccess>
        deny from all
        </Files>

    Thanks for your time.

    Read the article

  • Use of BitTorrent for large file downloads as an alternative to FTP

    - by questzen
    The company I work for procures large volumes of data by subscribing to FTP locations. I was wondering if it is possible to download the same data using a tracker; the major challenge is authentication of the users, IMO. Most FTP servers we subscribe to restrict the number of FTP connection attempts. Does anyone here have any experience with this? Any advice is welcome. Edit: To clarify, we subscribe to third-party vendors and access their FTP locations using credentials provided by them. The service is not exclusive to us; they sell their data to several others. If we could be part of a swarm, the download rates would be pretty high without added penalty. The question is about the possibility of achieving this, so that we can put forth a proposal along those lines. The vendors obviously wouldn't share data with non-subscribers, so that is a constraint.

    Read the article

  • Force download menu for remote files

    - by o-logn
    Hey, I would like users to upload links on my site. When another user clicks on a link (e.g. to a PDF file), I would like the download popup to show instead of the PDF actually displaying in the browser. I know I can use Response.AddHeader/Response.WriteFile to achieve this, but the WriteFile method requires a virtual path, and the links uploaded by users will point to external servers. Can I still force the download popup to show, and if so, what would be the most efficient way of doing it? Thanks for any advice.
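
    The usual workaround is to proxy the remote file through your own server and add the forced-download header while streaming it. A rough sketch of that idea (not ASP.NET; written in Python with Flask and requests for illustration, and omitting URL validation and error handling):

        import requests
        from flask import Flask, Response, request

        app = Flask(__name__)

        @app.route("/fetch")
        def fetch():
            url = request.args.get("url")            # the user-uploaded external link
            upstream = requests.get(url, stream=True)
            name = url.rsplit("/", 1)[-1] or "download"
            return Response(
                upstream.iter_content(chunk_size=64 * 1024),   # stream, don't buffer
                content_type=upstream.headers.get("Content-Type",
                                                  "application/octet-stream"),
                headers={"Content-Disposition": 'attachment; filename="%s"' % name},
            )

    The same shape translates directly to an ASP.NET handler: open a stream to the external URL, set Content-Disposition: attachment, and copy the bytes through to the response.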

    Read the article

  • iOS Downloading Videos and saving in Application Support folder

    - by Satyam svv
    In my application, I have to download around 10 videos and play them accordingly. Each video is around 50 MB. I'm using the following code, and after downloading a video I save it to the Application Support folder to avoid iCloud sync. But the problem is that the app crashes while downloading the videos.

        [NSURLConnection sendAsynchronousRequest:req queue:[[NSOperationQueue alloc] init] completionHandler:^(NSURLResponse *response, NSData *rcvdDat, NSError *err) { . . . }];

    What I'm thinking is that, while downloading, the whole video resides in memory, so the total memory occupied by the app keeps increasing until iOS kills it. I would like to download the video so that whenever a chunk of data is received, it is written to a temp file, and when the download completes, the file is moved to the Application Support folder. Can someone help me with how to write it to a file and save it at the end? I cannot use third-party libraries (unless they are small) due to legal issues.

    Read the article

  • File download using Java, Struts 2 and AJAX

    - by amar4kintu
    Hello friends, I want to implement a file download using Java, Struts 2 and Ajax. On my HTML page there is a button called "export"; clicking it makes an Ajax call which executes a query and creates an .xls file in code, and I want to offer that file for download to the user without storing it on the hard drive. Does anyone know how to do that using Struts 2 and Ajax in Java? Is there any example available? Let me know if you need more details from me... Thanks, amar4kintu

    Read the article

  • Download authentication?

    - by Sahat
    Hi, I am sorry if this question has been asked before, but I am looking for some sort of download authentication. In other words, if I am going to give a user a link to a file, I want to make sure only that person gets it, and gets it only once! Is there a simple solution without setting up a whole database? Even better if it's possible to have an encrypted web link that will let you download a file from my FTP server just once, after which the link becomes invalid. Thanks.
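
    A database-free approach is to sign an expiring token into the link itself with HMAC, then have the download endpoint verify it and remember which tokens were already redeemed. A minimal Python sketch of the idea (the used-token set would need to live somewhere persistent in a real deployment):

        import hashlib
        import hmac
        import time

        SECRET = b"server-side-secret"   # never leaves the server
        used = set()                     # stand-in for persistent used-token storage

        def make_link(filename, valid_seconds=3600):
            expires = int(time.time()) + valid_seconds
            msg = ("%s:%d" % (filename, expires)).encode()
            sig = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
            return "/download?file=%s&expires=%d&sig=%s" % (filename, expires, sig)

        def verify(filename, expires, sig):
            msg = ("%s:%s" % (filename, expires)).encode()
            expected = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
            if not hmac.compare_digest(expected, sig):
                return False             # link was tampered with
            if int(expires) < time.time():
                return False             # link expired
            if sig in used:
                return False             # already downloaded once
            used.add(sig)
            return True

        print(make_link("album.zip"))

    The "only once" guarantee is the part that genuinely needs some server-side state; the signature and expiry alone need none.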

    Read the article

  • Download the 'Getting Started with Ubuntu 12.10' Manual for Free

    - by Asian Angel
    Today is the official release date for Ubuntu's latest version, so why not download the manual to go with it? This free manual is available to view online or to download as a 145-page PDF file, to best suit your needs. The home page for the manual will display a large Download button, but the best option is to click on the Alternative Download Options link. Clicking on the Alternative Download Options link lets you select the language version you want, choose a system version, and either download the manual directly or view it online.

    Read the article
