Search Results

Search found 28217 results on 1129 pages for 'website download'.


  • Ways to go about optimizing website performance: WordPress, Amazon EC2, Apache, and RDS MySQL

    - by fuzzybee
    I have 6 WordPress websites running on a single EC2 instance. All the websites connect to databases in the same RDS instance. Earlier today, traffic to the largest website peaked and the RDS instance became a bottleneck: CPU utilization sat at 100% for over an hour. It affected all of my websites, as they all took forever to load. To prevent such an issue from happening again, which of the following matters most, so that I invest time and effort in it first? (I will work on all of them later; I just need to prioritise now.) (1) Improve caching for all websites. (2) Fine-tune the database server. (3) Fine-tune my Apache server. And what will the effect on user experience be for my websites? Some quick searches suggest I should limit the number of concurrent connections to my web server, but wouldn't that prevent users from accessing my websites? More background: my largest website has 140k visits and 660k page views a month; the other 5 websites add up to much less than that. I'm using a large EC2 instance as the web server and a medium RDS instance as the database server. What I've already done: use the W3 Total Cache plugin for caching on most of the websites, especially the largest one (there is barely anything else I can do in terms of caching for the largest website). Am I using my resources wastefully, or is there simply not enough resource for my websites? Or rather, how do I answer that question myself?
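
    On the caching front, one quick sanity check is that WordPress's page-cache drop-in is actually switched on, and that revision bloat isn't inflating the wp_posts tables that RDS has to scan. A minimal wp-config.php sketch; the constant names are standard WordPress, but the values here are illustrative assumptions rather than tuned recommendations:

        <?php
        // wp-config.php excerpt
        define('WP_CACHE', true);        // required for W3 Total Cache's page-cache drop-in
        define('WP_POST_REVISIONS', 5);  // cap stored revisions to limit wp_posts growth
        define('EMPTY_TRASH_DAYS', 7);   // purge trashed posts weekly instead of monthly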

    Read the article

  • Software to automate website screenshot capture

    - by Leniel Macaferi
    Do you know any software that can automate the process of getting screenshots of every page of a website? It would act like a spider/crawler/robot, you name it. For example: I developed a website and now I'd like to get a screenshot of every page of the site. I could of course do it manually, but it's a lot of work: for each module of the site (Student, Payment, etc.) I have different pages/forms (Create, Edit, Details, Delete, etc.). The thing I'm looking for is software that can visit every link of the site and then capture the screen, automating the whole process. It would also be good if the software allowed the user to pass in a list of URLs to capture, allowing even more fine-grained configuration. EDIT: I tried Selenium, mentioned by Aaron in his answer, but I managed to find an app that does exactly what I needed. It's called Paparazzi!. I wrote a blog post to showcase my attempt at Selenium and the findings regarding Paparazzi!'s batch capture functionality: Software to automate website screenshot capture
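
    For the pass-a-list-of-URLs case, a small script shelling out to a command-line renderer also works. A minimal PHP sketch, assuming the wkhtmltoimage tool (from the wkhtmltopdf project) is installed and on the PATH, and that urls.txt holds one URL per line:

        <?php
        // capture.php -- screenshot every URL listed in urls.txt
        $urls = file('urls.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
        foreach ($urls as $i => $url) {
            $out = sprintf('shot_%03d.png', $i);
            // escapeshellarg() guards against shell metacharacters in the URL
            shell_exec('wkhtmltoimage ' . escapeshellarg($url) . ' ' . escapeshellarg($out));
            echo "Captured $url -> $out\n";
        }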

    Read the article

  • Silverlight File download from client

    - by luke
    I have a Silverlight application that implements some basic CRUD operations on a fairly flat data set. The application loads all the data onto the client to allow for quick editing (this is a fairly small data set, no more than a couple thousand records). I would like to let users download the data as a CSV file so they can edit it locally. I know I can set up a HyperlinkButton pointing at a URL on my web server and then serve the data dynamically using a custom server handler, but this seems roundabout to me because all the data is already on the client's machine (the Silverlight application loaded it). So I was wondering: is there a way to prompt the user for a file download and then dynamically generate the download stream from Silverlight itself?

    Read the article

  • How to display a dialog if the user can't download a file

    - by tom smith
    Hi. I'm dealing with PHP/HTML/JavaScript and trying to figure out a good approach to letting a user download a file. I can have the traditional href link that interfaces with the back-end PHP app to download the file. However, I want the app to display some sort of dialog/alert if the user isn't able (based on ACL/permissions) to download the file. Does this have to be an Ajax thing? I don't want to do a page refresh. Thoughts/comments/pointers to code samples are appreciated. Thanks. -tom
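
    One refresh-free approach is to have the link first hit a tiny PHP endpoint via XMLHttpRequest, show the alert if the JSON answer says no, and only then follow the real download URL. A minimal sketch of the endpoint; can_download() and the session key are hypothetical stand-ins for whatever ACL check and login scheme the app already has:

        <?php
        // check_download.php -- answer "may this user download this file?"
        session_start();
        header('Content-Type: application/json');
        $fileId = isset($_GET['file']) ? $_GET['file'] : '';
        // can_download() is a placeholder for the app's real ACL/permission logic
        $allowed = can_download($_SESSION['user_id'], $fileId);
        echo json_encode(array('allowed' => (bool) $allowed));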

    Read the article

  • Blank installNotifyURI in OMA Download Descriptors

    - by joynes
    People can download content (music, images) for their mobiles from my server. I'm trying to use the installNotifyURI tag of the download descriptors specified by OMA to find out whether a download succeeded. When the user has downloaded the item, I do get a POST to the URL I specified in the installNotifyURI tag: <installNotifyURI>http://joynes.se/mytest.php</installNotifyURI> However, I never get any status code; the POST is just blank. Anyone who knows about this problem? /Br Johannes
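
    To see exactly what the handset does or doesn't send, mytest.php can log the raw request body; under OMA Download 1.0 the status arrives as plain text in the POST body (for example "900 Success"), not as a header. A minimal logging sketch, with the log path being an arbitrary choice:

        <?php
        // mytest.php -- record whatever the device POSTs to the installNotifyURI
        $body = file_get_contents('php://input');   // raw POST body, if any
        file_put_contents('/tmp/install_notify.log',
            date('c') . ' ' . $_SERVER['REMOTE_ADDR'] . "\nBODY:\n" . $body . "\n----\n",
            FILE_APPEND);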

    Read the article

  • Download attachment issue with IE6-8 (non-SSL)

    - by Arun P Johny
    I'm facing an issue with file downloads in IE6-8 in a non-SSL environment. I've seen a lot of articles about the IE attachment download issue with SSL, and as per those articles I tried setting the Pragma and Cache-Control headers, but still no luck. These are my response headers:
    Cache-Control: private, max-age=5
    Date: Tue, 25 May 2010 11:06:02 GMT
    Pragma: private
    Content-Length: 40492
    Content-Type: application/pdf
    Content-Disposition: Attachment;Filename="file name.pdf"
    Server: Apache-Coyote/1.1
    I set the header values after going through some of these articles: KB 812935, KB 316431. But those items are related to SSL. I've checked the response body and headers using Fiddler, and the response body is proper. I'm using window.open(url, "_blank") to download the file; if I change it to window.open(url, "_parent"), or change the Content-Disposition to 'inline;Filename="file name.pdf"', it works fine. Please help me solve this problem.
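
    One thing worth checking is the filename in "file name.pdf": older IE versions are fragile about spaces and quoting in Content-Disposition. As a reference point, the header combination below is generally safe for IE6-8 attachment downloads. It is sketched in PHP purely for illustration ($path and the script name are hypothetical; the question's actual server is Tomcat), since the same headers apply on any stack:

        <?php
        // send_pdf.php -- IE-friendly attachment headers (illustrative sketch)
        $path = '/data/file_name.pdf';   // hypothetical location of the PDF
        header('Content-Type: application/pdf');
        header('Content-Disposition: attachment; filename="file_name.pdf"');
        header('Content-Length: ' . filesize($path));
        // IE6-8 mishandle "no-cache"/"no-store" on downloads; "private" is safe
        header('Cache-Control: private, max-age=5');
        header('Pragma: private');
        readfile($path);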

    Read the article

  • Flex Datagrid File Download old value

    - by Vish
    Hi, we use an AIR client and a WAMP server. I've got a weird issue with the Flex DataGrid. When I download an image by double-clicking on a DataGrid row, the file gets downloaded correctly. But when I go on to select another row for download, even though the correct parameters are being passed, it again prompts a download of the old file. Upon cancelling that and trying again, it picks up the selected row. Not sure how to fix this issue; below is the DataGrid code I am using:
        <mx:DataGrid x="10" y="10" id="ourDataGrid" visible="true" enabled="true"
                     dataProvider="{fileList}" verticalScrollPolicy="on"
                     selectionColor="#B7C6E7" itemClick="downloadFile(event)">
            <mx:columns>
                <mx:DataGridColumn headerText="Name" dataField="@name" />
                <mx:DataGridColumn headerText="LastName" dataField="@lastname" />
                <mx:DataGridColumn headerText="FirstName" dataField="@firstname" />
            </mx:columns>
        </mx:DataGrid>

    Read the article

  • How to prevent Google Website Optimizer from making Google Analytics spike Direct Traffic and lower Bounce Rate?

    - by Scott
    I am using Google Website Optimizer (GWO) and Google Analytics. Whenever Google Website Optimizer performs a JavaScript redirect, Google Analytics will change the referrer; when the referrer changes to your own site, the traffic source is filed under Direct Traffic. For example: a visitor goes to Google, searches for my great service, and clicks the link to my page /home/. At this point, Google Analytics tracks the referrer as Google. However, /home/ has a GWO JavaScript redirect to a battery of A/B tests: /home-1/, /home-2/, or /home-3/. When the redirect from /home/ to /home-1/ occurs, Google Analytics on the /home-1/ page now thinks the referrer is my own site and converts the traffic source to Direct Traffic, since Direct Traffic is the bucket for unknowns. I'm really surprised that GWO and GA do this when they both come from Google. Now, how does one prevent GWO from overwriting the referrer?

    Read the article

  • Is Paypal the best solution for payment gateway for a website?

    - by Pennf0lio
    I have a realty website that needs a payment gateway for property reservations. The reservation fee ranges from $500-$600, with about 5-6 reservations per month. I was wondering if PayPal is the best solution for accepting payment, and what the pros and cons of using PayPal would be. PayPal was my first choice because it's easy to integrate with my existing website and I wouldn't have to worry as much about security. P.S. It's not part of the question, but if you can cite some realty websites that accept payments and would be a good inspiration, it would be highly appreciated. Thanks!

    Read the article

  • Troubleshooting Website problems within the local network

    - by HaydnWVN
    We have an external website which opens fine on some PCs, yet seems to time out (or shows the symptoms of timing out, without ever actually doing so) on others. It seems to affect only (some of) our newer HP Pro 3305 MT workstations, all running Win7 32-bit SP1 with all updates. Older PCs (Win7 32-bit SP1 and WinXP) are unaffected. Using Google Chrome or Firefox makes no difference, and opening the website in IE9 Compatibility Mode shows exactly the same symptoms. All PCs are on the same local network (workgroup), using the same DNS server and gateway (in-house), on the same internet connection and the same subnet. There is no proxy server, no content filtering, no load balancing, etc. The only group policy in effect (locally) is for update scheduling. Local firewalls are all the same (Kaspersky WP4) and our external-facing firewall has no IP-specific settings. I have no control over the external website; traceroute shows the same destination on all PCs. It is a fairly popular website in our industry (horticulture) and I'm not aware of any other people (even other sites within our sister companies) with the same problem. Update: I used Fiddler2 to monitor the HTTP request; it seems it is not getting fulfilled for some reason. Request sent:
    GET http://www.rhs.org.uk/ HTTP/1.1
    Host: www.rhs.org.uk
    Connection: keep-alive
    User-Agent: Mozilla/5.0 (Windows NT 6.1) AppleWebKit/536.11 (KHTML, like Gecko) Chrome/20.0.1132.47 Safari/536.11
    Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
    Accept-Encoding: gzip,deflate,sdch
    Accept-Language: en-GB,en-US;q=0.8,en;q=0.6
    Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3
    Log from Fiddler2 of the failing request (this session is not yet complete):
    Request Count: 1
    Bytes Sent: 567 (headers:567; body:0)
    Bytes Received: 0 (headers:0; body:0)
    ACTUAL PERFORMANCE
    ClientConnected: 17:02:33.720
    ClientBeginRequest: 17:02:39.118
    GotRequestHeaders: 17:02:39.118
    ClientDoneRequest: 17:02:39.118
    Determine Gateway: 0ms
    DNS Lookup: 0ms
    TCP/IP Connect: 46ms
    HTTPS Handshake: 0ms
    ServerConnected: 17:02:39.165
    FiddlerBeginRequest: 17:02:39.165
    ServerGotRequest: 17:02:39.165
    ServerBeginResponse: 00:00:00.000
    GotResponseHeaders: 00:00:00.000
    ServerDoneResponse: 00:00:00.000
    ClientBeginResponse: 00:00:00.000
    ClientDoneResponse: 00:00:00.000
    RESPONSE BYTES (by Content-Type)
    ~headers~: 0
    Log of a successful request from a working PC (done this morning, excuse the timestamps differing from above):
    Request Count: 1
    Bytes Sent: 493 (headers:493; body:0)
    Bytes Received: 20,413 (headers:525; body:19,888)
    ACTUAL PERFORMANCE
    ClientConnected: 08:22:47.766
    ClientBeginRequest: 08:22:47.766
    GotRequestHeaders: 08:22:47.766
    ClientDoneRequest: 08:22:47.766
    Determine Gateway: 0ms
    DNS Lookup: 26ms
    TCP/IP Connect: 30ms
    HTTPS Handshake: 0ms
    ServerConnected: 08:22:47.828
    FiddlerBeginRequest: 08:22:47.828
    ServerGotRequest: 08:22:47.828
    ServerBeginResponse: 08:22:48.905
    GotResponseHeaders: 08:22:48.905
    ServerDoneResponse: 08:22:48.905
    ClientBeginResponse: 08:22:48.905
    ClientDoneResponse: 08:22:48.905
    Overall Elapsed: 00:00:01.1388020
    RESPONSE BYTES (by Content-Type)
    text/html: 19,888
    ~headers~: 525
    So my question has evolved into: what is the difference between the two requests, and how do I determine why one PC is not getting a reply to its GET request?

    Read the article

  • What is the typical example of old school website design?

    - by Pierre 303
    I want to build a website for a retro thing that was popular in the mid 90s (the beginning of the commercial internet), so I want to use the old designs that were very popular at that time. The first thing that comes to mind is those "under construction" animated GIFs; people often put animated GIFs everywhere. But also those awful repeating backgrounds. So yes, I want my website to look exactly like it's the mid nineties ;) (Please suggest practical and usable features. I guess a Java applet menu would not work today, nor would saying at the bottom that this website is optimized for Netscape 3.) EDIT: for those who want to see the result: Retrology

    Read the article

  • SQLAuthority News – Whitepaper Download – Using Star Join and Few-Outer-Row Optimizations to Improve Data Warehousing Queries

    - by pinaldave
    The size of databases is growing every day. Many organizations nowadays have terabytes of data in their systems, and performance is always part of the issue. Microsoft is really paying attention to this and is focusing on improving performance for data warehousing. Microsoft has recently released a whitepaper on this performance tuning subject. Here is the abstract from the official site: In this white paper we discuss two of the new features introduced in SQL Server 2008, Star Join and Few-Outer-Row optimizations. These two features are in SQL Server 2008 R2 as well. We test the performance of SQL Server 2008 on a set of complex data warehouse queries designed to highlight the effect of these two features and observed a significant performance gain over SQL Server 2005 (without these two features). The results observed also apply to SQL Server 2008 R2. On average, about 75 percent of the query execution time has been reduced, compared to SQL Server 2005. We also include data that shows a reduction in the number of rows processed and improved balance in parallel queries, both of which highlight the important role the Star Join and Few-Outer-Row features played. I encourage all of those interested in data warehousing to read it and see if they can learn the tricks: Using Star Join and Few-Outer-Row Optimizations to Improve Data Warehousing Queries. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Documentation, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority News, T SQL, Technology

    Read the article

  • PHP MINISERVER DOWNLOAD RESUME-ERROR! Resource id # 4

    - by snikolov
    <?php
    // Minimal HTTP file server on port 9090, cleaned up and with the two
    // reported bugs fixed: (1) $file was reassigned from the file NAME to the
    // fopen() HANDLE, so concatenating it into Content-Disposition printed
    // "Resource id #4" as the download name; (2) the Range header was never
    // actually parsed, so resuming could not work.
    $httpsock = @socket_create_listen(9090);
    if (!$httpsock) {
        print "Socket creation failed!\n";
        exit;
    }

    while (true) {
        $client = socket_accept($httpsock);
        $input  = trim(socket_read($client, 4096));
        $parts  = explode(" ", $input);
        $path   = $parts[1];

        // Parse "Range: bytes=N-" out of the raw request, if present
        $range = null;
        if (preg_match('/Range: bytes=(\d+)-/i', $input, $m)) {
            $range = (int) $m[1];
        }

        if ($path == "/") {
            $path = "/index.html";
        }
        $filename = "." . $path;

        if (file_exists($filename) && is_readable($filename)) {
            echo "Serving $filename\n";
            $filesize = filesize($filename);
            $fh = fopen($filename, 'rb');   // keep the handle separate from the name

            if ($range !== null && $range < $filesize) {
                $last    = $filesize - 1;
                $output  = "HTTP/1.1 206 Partial Content\r\n";
                $output .= "Content-Range: bytes $range-$last/$filesize\r\n";
                $output .= "Content-Length: " . ($filesize - $range) . "\r\n";
                fseek($fh, $range);
            } else {
                $output  = "HTTP/1.1 200 OK\r\n";
                $output .= "Content-Length: $filesize\r\n";
            }
            $output .= "Content-Type: application/octet-stream\r\n";
            // basename($filename), NOT the file handle, goes into the header
            $output .= "Content-Disposition: attachment; filename=\"" . basename($filename) . "\"\r\n";
            $output .= "Accept-Ranges: bytes\r\n";
            $output .= "Connection: close\r\n\r\n";
            socket_write($client, $output);

            $bytes_sent = 0;   // per-client counter: log this to track bandwidth per download
            while (!feof($fh)) {
                $chunk = fread($fh, 1024 * 1024);
                socket_write($client, $chunk, strlen($chunk));
                $bytes_sent += strlen($chunk);
            }
            fclose($fh);
            echo "Sent $bytes_sent bytes of $filename\n";
        } else {
            $body    = "The file you requested doesn't exist. Sorry!";
            $output  = "HTTP/1.0 404 Not Found\r\nServer: BabyHTTP\r\nConnection: close\r\n";
            $output .= "Content-Type: text/html\r\nContent-Length: " . strlen($body) . "\r\n\r\n" . $body;
            socket_write($client, $output);
        }
        socket_close($client);
    }
    socket_close($httpsock);

    Hey guys, I have created a mini webserver/downloader; it can download files from your server. However, when I download a file the browser shows "Resource id #4" as the file name, and I can't resume the download. I would also like to know how I can monitor and record each client's usage, i.e. how much bandwidth each client has downloaded. Perl has something like this, but it's hardcore. If possible, kindly provide me with some pointers. Thank you :)

    Read the article

  • How to download all links at once

    - by user216112
    Hello friends, I want to ask: is there any software that can download all the links at once? For example, on w3school.com, if we want to download all the PHP tutorials at once. Someone told me about "tglepote", but I can't find it on Google. Please help.

    Read the article

  • General Website Security

    - by Tom
    I pay monthly for a website hosting service that provides me with PHP and FTP support. I can upload my files and create directories and such. Now, I am wondering: if I upload a folder full of images or music (basically personal stuff) to my website and name it 'junk1234', can other people find it? Or even search engines? If so, how would I restrict anyone who doesn't know the folder name from seeing the files in it? Possibly .htaccess files? I also have cPanel installed.
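
    An obscure folder name is not real protection: an enabled directory listing, a leaked link, or a crawler following one can expose it. A safer pattern is to keep the files in a directory that isn't web-accessible at all and serve them through a small PHP gate. A minimal sketch, assuming the files live in a ../private/ directory above the webroot and that $_SESSION['authed'] is a hypothetical flag set by some login page:

        <?php
        // get_file.php -- serve private files only to authenticated visitors
        session_start();
        if (empty($_SESSION['authed'])) {
            header('HTTP/1.0 403 Forbidden');
            exit('Not allowed.');
        }
        // basename() strips any ../ path tricks from the requested name
        $name = basename(isset($_GET['f']) ? $_GET['f'] : '');
        $path = __DIR__ . '/../private/' . $name;
        if ($name === '' || !is_file($path)) {
            header('HTTP/1.0 404 Not Found');
            exit('No such file.');
        }
        header('Content-Type: application/octet-stream');
        header('Content-Disposition: attachment; filename="' . $name . '"');
        header('Content-Length: ' . filesize($path));
        readfile($path);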

    Read the article

  • Download file with progress bar in cocoa?

    - by happyCoding25
    Hello, I need a progress bar that responds to the percent complete of a download in Cocoa. I think this might use things like NSProgressIndicator and possibly NSTask. I'm not sure if there's an "official" way to download files in Cocoa, because up until now I have just used curl with NSTask. Thanks for any replies.

    Read the article

  • How to access website on localhost from internet?

    - by Nitesh Panchal
    Hello, I turned my PC into a router by enabling port forwarding on port 80, and I registered a hostname (xyz.donexist.org) through dyndns.com. Now when I type my public IP address into the browser, I get directed back to my own machine. I have installed GlassFish and have my website deployed in it. I want my website to open when I type xyz.donexist.org. What further steps do I need to take? I have made an entry in the /etc/hosts file as: 127.0.0.1 xyz.donexist.org Please guide me, I am a beginner. Thanks in advance :)

    Read the article

  • Cannot access website from inside network

    - by musclez
    I have a website running on my internal network, available at the (example) IP 192.168.1.5. When I type this into the browser, it redirects to my domain name, i.e. "example.com", and gives me Error code: ERR_CONNECTION_REFUSED. Any other machine inside the network can access the website, and the website is also accessible from outside the network. Other services from the server, like file sharing and FTP, are available to all machines in the network, including the one I'm having HTTP issues with. The issue may be linked to a proxy service, but to my understanding the service has been completely disabled and its executables have been uninstalled from the machine. I am wondering if there is some residual proxy information remaining on the machine that limits the connection. I'm fairly positive that "example.com" is what is being blocked by the local machine, rather than an IP address being blocked or a faulty connection. When I examine the hosts file, there are no redirects of "example.com" to the local machine. There was a rule, as on my other machines within the network: 192.168.1.5 example.com, but I have since removed it for troubleshooting purposes. What intrigued me is that when I use the actual IP, the browser redirects the IP address to the domain and THEN says ERR_CONNECTION_REFUSED. Server-side results: the server logs are reporting this:
    example.com ::1 - - [Date & time] "OPTIONS * HTTP/1.0" 200 126 "-" "Apache/2.2.22 (Unix) (internal dummy connection)"
    However, this seems to be irrelevant, as it is not triggered when I try to connect to the server from the specified machine. Fiddler results:
    Host: *example.com*
    Proxy-Connection: keep-alive
    Chrome-side: [Fiddler] The connection to 'example.com' failed. Error: ConnectionRefused (0x274d). System.Net.Sockets.SocketException No connection could be made because the target machine actively refused it 01.23.45.67:80
    01.23.45.67:80 would be the external IP, which the server and the machine in question both share. Reading into 0x274d brings back .NET web.config information, but I am still at a loss as to what to do with it. I also have Wireshark running; there is a lot of sensitive information in the readout and I'm not sure what to extract from it. Either way, I can access that information if anyone would like me to. Thanks for the help!

    Read the article

  • Download multiple files from an array C#

    - by Sandeep Bansal
    Hi everyone, I have an array of file names which I want to download. The array is currently contained in a string[] and the work is running inside a BackgroundWorker. What I want to do is use that array to download the files and report the result to a progress bar, which will tell me how long I have left until completion. Is there a way I can do this? Thanks.

    Read the article

  • PDF download dialog should appear

    - by tibin mathew
    Hi, I need some code to force the download of a PDF file. What I mean is: normally, when we write href="./downloads/Intake Sheet rev 4-1-10.pdf", the PDF opens in the same window. I don't want that; I need the file to be downloaded, meaning the download dialog should appear. I'm using PHP to develop my website. Please give me some ideas. Thanks.
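
    Since the site is PHP, the usual trick is to point the link at a small script that sends a Content-Disposition: attachment header, which makes the browser show its save dialog instead of rendering the PDF inline. A minimal sketch; the path comes from the question, while the script name download.php is an assumption:

        <?php
        // download.php -- force the browser's save dialog for the PDF
        $path = './downloads/Intake Sheet rev 4-1-10.pdf';
        header('Content-Type: application/pdf');
        header('Content-Disposition: attachment; filename="' . basename($path) . '"');
        header('Content-Length: ' . filesize($path));
        readfile($path);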

    Read the article

  • [Python] Download an image embedded in a MIME multipart message

    - by michele
    Hi, I have to download some images from links. These links return a file with a multipart MIME message embedded in it, containing a TIFF image. I have written the code below, but it downloads the file with the MIME wrapper still included. How can I remove the MIME parts from this file so that just the image is returned? Can I do this with wget or curl? My code:
        def download(url, local):
            import urllib
            urllib.urlretrieve(url, local)
            urllib.urlcleanup()
    Thanks a lot.

    Read the article

  • Error while adding web service to server in WebsitePanel

    - by sam
    I got the following error while creating a website for a user in WebsitePanel. I am not able to create any hosting space in the server's hosting plan; it is showing 0 MB of space.
    Stack Trace:
    [SoapException: System.Web.Services.Protocols.SoapException: Server was unable to process request. ---> System.UriFormatException: Invalid URI: The Authority/Host could not be parsed.
    at System.Uri.CreateThis(String uri, Boolean dontEscape, UriKind uriKind)
    at System.Uri..ctor(String uriString)
    at Microsoft.Web.Services3.WebServicesClientProtocol.set_Url(String value)
    at WebsitePanel.Server.Client.ServerProxyConfigurator.Configure(WebServicesClientProtocol proxy)
    at WebsitePanel.EnterpriseServer.ServiceProviderProxy.ServerInit(WebServicesClientProtocol proxy, ServerProxyConfigurator cnfg, String serverUrl, String serverPassword)
    at WebsitePanel.EnterpriseServer.ServiceProviderProxy.ServerInit(WebServicesClientProtocol proxy, ServerProxyConfigurator cnfg, Int32 serverId)
    at WebsitePanel.EnterpriseServer.ServiceProviderProxy.Init(WebServicesClientProtocol proxy, Int32 serviceId)
    at WebsitePanel.EnterpriseServer.WebAppGalleryController.InitFeedsByServiceId(Int32 UserId, Int32 serviceId)
    at WebsitePanel.EnterpriseServer.esWebApplicationGallery.GetGalleryApplicationsByServiceId(Int32 serviceId)
    --- End of inner exception stack trace ---]
    System.Web.Services.Protocols.SoapHttpClientProtocol.ReadResponse(SoapClientMessage message, WebResponse response, Stream responseStream, Boolean asyncCall) +1485877
    System.Web.Services.Protocols.SoapHttpClientProtocol.Invoke(String methodName, Object[] parameters) +221
    WebsitePanel.EnterpriseServer.esWebApplicationGallery.GetGalleryApplicationsByServiceId(Int32 serviceId) +68
    WebsitePanel.Portal.WebAppGalleryHelpers.GetGalleryApplicationsByServiceId(Int32 serviceId) +31
    Can anybody help me with this?

    Read the article
