Search Results

Search found 14789 results on 592 pages for 'pro backup'.


  • Analytics: Can I set a goal on multiple events?

    - by David Parks
    We have a popup dialogue that requests the user's email address or Facebook login. The page behind the popup loads, so a page view is counted. We want to measure: (1) how many users ignored the popup completely; (2) how many users engaged the popup but didn't complete the process (we trigger an event when the user performs actions defined as "engaging"); and (3) how many users completed the popup. Bounce rates aren't telling, because some users won't receive the popup. We are basically triggering the events "PopupDisplayed", "PopupEngaged", and "PopupComplete", with labels to differentiate between email and Facebook. But I don't think I can set goals to count users who received both the "PopupDisplayed" AND "PopupComplete" events, which is what I'd need to count how many users both saw the popup and completed it.

    Read the article

  • Tools to simulate mobile devices on a desktop to test websites

    - by Kris
    Are there any good tools that can be run on desktop machines (Windows or Linux) to simulate a mobile device, preferably with some options for screen size and mobile browser (user agent, if not a full render engine)? I know it is never going to be perfect (especially without an actual touchscreen), but having a tool on our development machines to do what testing we can that way would be very useful.

    Read the article

  • Apache redirecting when it shouldn't

    - by Ivan Moldavi
    I have inherited a LAMP server with a lot of customization. I want to enable mod_status in Apache to get some monitoring started. The problem is that with mod_status enabled in httpd.conf, I cannot reach the http://serverIP/server-status URL because Apache redirects me to a 'landing page'. If I type any invalid URL for the server, it redirects to this same page, even though all ErrorDocument directives have been commented out of httpd.conf. Any pointers to where this redirect may have been configured, or a way to override it so I can reach http://serverIP/server-status?
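
    For comparison, a minimal mod_status block looks like this (Apache 2.2 syntax; the allowed IP is a placeholder):

        # minimal sketch of a mod_status block; 192.0.2.10 is a placeholder admin IP
        ExtendedStatus On
        <Location /server-status>
            SetHandler server-status
            Order deny,allow
            Deny from all
            Allow from 192.0.2.10
        </Location>

    If the landing-page redirect comes from a site-wide RewriteRule catch-all, an exemption placed above it, such as RewriteRule ^/?server-status - [L], lets the status handler answer before the redirect fires; a recursive grep for RewriteRule, Redirect, and ErrorDocument across the whole Apache config directory is a quick way to find where the customization lives.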

    Read the article

  • Can robots.txt fix all soft 404s?

    - by olo
    I see many soft 404s in Google Webmaster Tools for web pages that no longer exist, so I am unable to insert <meta name="robots" content="noindex, nofollow"> into those pages, and my searching so far hasn't turned up any useful clues. About 100 URLs are soft 404s, and redirecting them all one by one would be silly and cost too much time. If I just add those links to robots.txt like below:

        User-agent: *
        Disallow: /mysite.asp
        Disallow: /mysite-more.html

    will that fix all the soft 404s solidly? Or is there a way to change all the soft 404s into hard 404s? Please give me some suggestions. Many thanks.
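
    For what it's worth, robots.txt only stops crawling; it does not turn a soft 404 into a hard one, and already-indexed URLs can linger. A minimal .htaccess sketch (mod_alias) that makes the two example URLs answer with a real 410 Gone instead:

        # sketch: serve a hard 410 for pages that no longer exist
        Redirect gone /mysite.asp
        Redirect gone /mysite-more.html

    If the 100 dead pages share a common path or extension, a single RedirectMatch with a pattern can cover them all in one line.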

    Read the article

  • Resolving a CNAME using a different DNS

    - by Sandeep Singh Rawat
    I have a domain name (e.g. abc.com) registered with GoDaddy, and a few subdomains (mail, blog) correctly set up to point to different hosts. Now I want to park my domain with a parking host (seohosting.com), which asked me to change my nameservers to their DNS. What I want is to redirect DNS queries for only the www (or @) record to seohosting.com, while still being able to use my other CNAMEs for my own purposes. Is there a way to do this? I don't have the host IP address of the parking host.
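
    If the current nameservers stay in charge, the usual trick is to change only the www record and leave the rest alone. An illustrative zone sketch with placeholder targets (parking.seohosting.com is invented; the real target, or an IP for the bare @ record, would have to come from the parking provider, which is exactly the piece the question says is missing):

        ; zone records at the current DNS provider (targets are placeholders)
        mail  IN  CNAME  mailhost.example.net.
        blog  IN  CNAME  bloghost.example.net.
        www   IN  CNAME  parking.seohosting.com.
        @     IN  A      192.0.2.50

    Note that the bare domain (@) cannot be a CNAME, so parking it requires an A record with the provider's IP address.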

    Read the article

  • Book recommendation: start learning web design and CSS with basic HTML knowledge

    - by Minh Hieu
    I already know some HTML (tables, links, images, etc.), but just at a basic level. Now I want to learn how to build a layout for a website, and how to design it as well. I want to start building a layout right away and learn from doing it, rather than reading a lot of theory and explanation. Many books are so verbose: they teach HTML from the very beginning or explain things too much, and I don't want to waste my time. So, are there any good books for me?

    Read the article

  • Blogspot as a simple CMS

    - by G1ug
    Blogger/Blogspot recently released a new version of their software. This new version appears to have features relevant to a simple CMS (static pages, albeit limited). I read on their Buzz Blog about a few websites that don't necessarily look like a typical Blogspot blog, but rather like a typical website deployed with a minimal CMS: http://buzz.blogger.com/2011/07/you-can-do-some-amazing-things-with.html Can anyone point me to resources where I can learn how to do this? (Preferably case studies with the steps to create such a website, as opposed to a generic Blogger HOWTO.) Bonus points if you can also tell me about the infrastructure of Blogger.com (software stack, etc.). Thanks

    Read the article

  • Scrapy cannot find div on this website [on hold]

    - by Jaspal Singh Rathour
    I am very new at this and have been trying to get my head around my first selector. Can somebody help? I am trying to extract data from the page http://groceries.asda.com/asda-webstore/landing/home.shtml?cmpid=ahc--ghs-d1--asdacom-dsk-_-hp#/shelf/1215337195041/1/so_false, specifically all the info under div class="listing clearfix shelfListing", but I can't figure out how to format response.xpath(). I have managed to launch the Scrapy console, but no matter what I type into response.xpath() I can't select the right node. I know it works, because when I type response.xpath('//div[@class="container"]') I get a response, but I don't know how to navigate to the "listing clearfix shelfListing" div. I am hoping that once I get this bit working I can continue making my way through the spider. Thank you in advance! PS: I wonder if it is even possible to scan this site. Is it possible for the owners to block spiders?
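
    Two things may be going on. First, an XPath @class test is an exact string match, so //div[@class="listing"] will not match class="listing clearfix shelfListing"; contains() is the usual workaround. Second, everything after the # in that URL is typically loaded by JavaScript in the browser, and Scrapy does not execute JavaScript, so the listing may simply not exist in the HTML the spider receives. A scrapy shell sketch of the selector side (the inner element names are assumptions):

        # @class is an exact match, so use contains() for multi-token class attributes
        listings = response.xpath('//div[contains(@class, "shelfListing")]')
        print(len(listings))  # 0 here most likely means the content arrives via AJAX

        # drilling into individual entries; the child structure is assumed, not verified
        for entry in listings.xpath('.//div[contains(@class, "listing")]'):
            title = entry.xpath('.//a/text()').extract()

    As for the PS: owners can disallow spiders via robots.txt, but on pages like this the more common obstacle is simply that the data arrives via background requests after page load.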

    Read the article

  • Domain forwarding with URL substitution in the address bar

    - by Mario Duarte
    Hello, I have a blog served by a machine I have at home. Since the IP can change, I set up a DynDNS domain that always points to that machine. However, I purchased a friendlier domain (at godaddy.com) and I would like to forward it to the blog. The problem is that if I simply forward it, users will see the DynDNS domain in the address bar and could bookmark those URLs, and that's a problem. I noticed that godaddy.com offers domain masking; although it does hide the DynDNS domain in the address bar, it also keeps the same root address there even when I navigate to another page. I also have the feeling that search engines will not like this domain masking thing. Does anyone know how I can accomplish what I want?
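
    If the Apache (or similar) config on the home machine is editable, a common pattern is to CNAME the purchased www host to the DynDNS name and let the web server make the purchased domain canonical, so bookmarks and search engines only ever see it. A sketch with placeholder hostnames:

        # vhost on the home server; www.example.com and myblog.dyndns.example are placeholders
        <VirtualHost *:80>
            ServerName www.example.com
            ServerAlias myblog.dyndns.example
            RewriteEngine On
            # 301 anything that arrives under the DynDNS name to the purchased domain
            RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
            RewriteRule ^(.*)$ http://www.example.com$1 [R=301,L]
            # ... DocumentRoot and the rest of the blog config ...
        </VirtualHost>

    On the GoDaddy side this means a plain CNAME record for www pointing at the DynDNS hostname, rather than the forwarding or masking tools.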

    Read the article

  • General website publishing questions involving a domain forwarding issue

    - by Gorgeousyousuf
    Even though I have some knowledge of and experience with web development, I had never been interested in obtaining a domain and publishing a website from my own server until now. Today I have been struggling with getting my own domain and configuring it using web resources. I started by learning the outline of the publishing process: web server installation, deploying a website for testing, router port forwarding, getting a domain, and forwarding the domain to my router, which in turn forwards HTTP requests to my web server. I am confused about some parts, and so far I cannot reach the website from outside the network. All of this is for learning purposes, so I am not paying much attention to security for now.

    I have Server 2008 and IIS 7.5 installed. I use a laptop with wireless access to the modem, a Zoom X6 5590. So far, I can successfully reach my website from any local computer by entering the host machine's internal IP address and port in a browser. Next, I forwarded port 80 of the host machine by creating a virtual server entry in the router options: 10.0.0.x (the host's static internal IP), TCP, start port 80, end port 80. I assume every request that arrives at the public IP on port 80 is now forwarded to the host machine (10.0.0.x) on port 80, so if everything went as intended, the website listening on port 80 should accept the request, process it, and respond.

    I expected to reach my website from outside the network by entering http://MyPublicIp:80 in a browser, but I have not managed it, even using GoDaddy's domain forwarding tool. I do see a small preview of my website when I click the "preview" button that checks whether the forwarding target address (http://publicip/Index.aspx) is reachable. I am sure the domain configuration plays no role in this problem, since using the public IP and port directly does not work either. So here is the first question: why am I facing this problem?

    After that, I have a couple of questions about domain forwarding with the GoDaddy tool. Can I forward my domain to a port other than the default HTTP port 80, for example 8080? Additionally, can I use a subdomain to forward to a different port on the host? What I want is this: if a client enters www.mydomain.com, website1 responds on a specified port, and if a client enters info.mydomain.com, another website listening on a different port responds. I tried adding a subdomain and forwarding it to an address like http://www.mydomain.com:8080/Index.aspx, with no success. Can I really do that? Finally, what if I have an FTP site listening on the default port 21 and I create a domain like ftp.mydomain.com that forwards to it? Is it possible to use subdomains for FTP site access?

    I know I am more than confused, but however you reply, you will help me get a clearer view of this subject. Thank you very much.
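
    On the port questions: a port can only be reached if it appears in the URL the client actually requests; neither DNS nor a plain domain forward can silently map info.mydomain.com onto port 8080 (GoDaddy can forward to a URL containing :8080, but the port then shows in the address bar). The usual approach on one public IP is to run both sites on port 80 and let IIS route by host header. A sketch with appcmd (site names and paths are placeholders):

        rem IIS 7.5 sketch: two sites share port 80, distinguished by host header
        %windir%\system32\inetsrv\appcmd add site /name:MainSite /bindings:http/*:80:www.mydomain.com /physicalPath:C:\sites\main
        %windir%\system32\inetsrv\appcmd add site /name:InfoSite /bindings:http/*:80:info.mydomain.com /physicalPath:C:\sites\info

    FTP works the same way at the DNS level: ftp.mydomain.com just needs a record pointing at the public IP with port 21 forwarded, and the FTP client supplies the port. As for the outside-access failure itself, one common cause is the router refusing "NAT loopback": testing http://MyPublicIp from inside the LAN fails on many home routers even when outside access works, so a test from an external connection is worth trying first.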

    Read the article

  • Googlebot fetches my pages very frequently: rel=nofollow, meta noindex, or robots.txt disallow?

    - by trante
    Googlebot fetches pages on my site very frequently, and this slows my website down. I don't want Googlebot to crawl so often. I have already decreased the crawl rate in Google Webmaster Tools, but I'm also considering these three tools: adding rel="nofollow" to links to my inner pages, so Googlebot won't crawl and index them; adding a "noindex" meta tag, so Google will remove those pages from the index and won't fetch them again; or adding Disallow: /mySomeFolder/ to robots.txt, so Googlebot won't crawl those pages at all. I'm planning to use one of these methods on my 56,000 pages, except for the 6-7 most important ones. Which method would you prefer, and what would the advantages and disadvantages be? Or won't it change my website's speed at all?
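
    One caveat before choosing: these options interact. A robots.txt Disallow stops Googlebot from fetching the pages at all, so it will never see a noindex meta tag placed on them, and rel="nofollow" on internal links does not reliably keep URLs out of the index. For pure crawl-load relief, robots.txt is the direct tool; a sketch for the folder approach (the Allow line is an illustrative placeholder):

        # Googlebot honors the most specific rule, so Allow can exempt key pages
        User-agent: *
        Allow: /mySomeFolder/important-page.html
        Disallow: /mySomeFolder/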

    Read the article

  • Showing schema.org aggregate rating in Google rich snippets

    - by nickh
    After adding schema.org microdata markup for reviews and aggregate ratings, I expected review and rating information to show up in rich snippets. Unfortunately, neither is being shown. Google's Structured Data Testing Tool finds the microdata, and there are no errors or warnings on the page. Any idea what's wrong with the microdata markup? Example 1: Live Page: http://www.shelflife.net/ljn-thundercats-series-3/bengali Google Test: http://www.google.com/webmasters/tools/richsnippets?q=http%3A%2F%2Fwww.shelflife.net%2Fljn-thundercats-series-3%2Fbengali Example 2: Live Page: http://www.shelflife.net/star-wars-mighty-muggs/asajj-ventress Google Test: http://www.google.com/webmasters/tools/richsnippets?q=http%3A%2F%2Fwww.shelflife.net%2Fstar-wars-mighty-muggs%2Fasajj-ventress
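
    For comparison, a minimal shape that validates as Product markup, with the AggregateRating nested inside the Product item (the values here are invented). One common gotcha is an aggregateRating itemprop sitting outside the Product's itemscope, so the rating belongs to no item:

        <!-- sketch: AggregateRating nested inside a Product; values are invented -->
        <div itemscope itemtype="http://schema.org/Product">
          <span itemprop="name">Bengali</span>
          <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
            rated <span itemprop="ratingValue">4.5</span>/<span itemprop="bestRating">5</span>
            based on <span itemprop="ratingCount">12</span> ratings
          </div>
        </div>

    Also worth remembering: valid markup makes rich snippets possible, not guaranteed; Google decides per site and per query whether to show them.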

    Read the article

  • Problem connecting to Magento Connect

    - by amir
    Hi, I'm using Magento 1.4.0, and when I try to get to Magento Connect and download a plugin, the page says: "Error: Please check for sufficient write file permissions. Your Magento folder does not have sufficient write permissions, which this web based downloader requires. If you wish to proceed downloading Magento packages online, please set all Magento folders to have writable permission for the web server user (example: apache) and press the "Refresh" button to try again." Does anyone know how I can fix this problem? Thanks. Update: the plugin I'm trying to use is the MagentoPycho lightbox, so I unpacked the folder into app/code/local, but it still doesn't show up in the admin area.
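
    The downloader is checking that the web server user can write to the Magento tree. A shell sketch of the commonly recommended baseline, run from the Magento root (www-data is an assumption; substitute your actual web server user, e.g. apache):

        # baseline permissions, then writable dirs the downloader needs
        find . -type d -exec chmod 755 {} \;
        find . -type f -exec chmod 644 {} \;
        chmod -R 775 var media downloader
        chown -R www-data:www-data .

    On the update: an extension unpacked manually into app/code/local usually also needs its module declaration XML in app/etc/modules (and a cache flush) before it appears in the admin area.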

    Read the article

  • IIS7 serving .mp4s that are not playable on iOS devices

    - by Danomite
    I've added the proper MIME type to the server and made sure it applies not only to the specific site but to all of the server's sites. The file is accessible and playable in my browser (Chrome), but when I try to pull it up on an iPhone, debug mode warns me that the "movie could not be played", while on an iPad it's "byte_range_error_message". I'm really at a loss as to why iOS devices won't load the video. I know it's not the video files themselves, because I've used the same file on a different server (on a shared hosting package). Any help is appreciated! -Dan
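
    iOS players insist that the server honor HTTP Range requests, which is what the iPad's byte_range_error_message points at; desktop Chrome will happily fetch the whole file, which is why it plays there. IIS's static file handler supports ranges natively, so the usual checklist is the MIME type plus making sure nothing (a rewrite rule, a managed handler, compression) sits between the request and the static handler. A site-level web.config sketch for the MIME piece:

        <!-- sketch: register .mp4 at the site level; the <remove> avoids a
             duplicate-entry error if the type is already defined server-wide -->
        <configuration>
          <system.webServer>
            <staticContent>
              <remove fileExtension=".mp4" />
              <mimeMap fileExtension=".mp4" mimeType="video/mp4" />
            </staticContent>
          </system.webServer>
        </configuration>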

    Read the article

  • Why Does the Same Site Contain and Not Contain Google Sitelinks in Different SERPs

    - by frank13
    Is there a way to get sitelinks on a Google SERP when searching on a site's name vs. the site's web address? For example, if you search "twin city kings", the first result is the website for Twin City Kings without sitelinks, but if you search "twincitykings.com", the first result is the website for Twin City Kings with sitelinks. Is it possible to get sitelinks on both SERPs? Thanks for helping or clarifying. Note: this question does not pertain to "how to get a sitelink"; it pertains to why the same site comes up in different SERPs with and without sitelinks.

    Read the article

  • Website address points to localhost

    - by munir ahmad
    Today in Firefox, while surfing the internet, I opened a website and got the warning "This site may harm your computer", with the option to add the site to an exception list if I wanted to open it. I added the website to the exception list and trusted it. Since then, whenever I open this website, it always points to localhost while the internet is connected. I have set up localhost through Apache (XAMPP). If the internet is not connected, the website does not open anything, but localhost still works as usual. How can I fix this so the website no longer points to localhost? I am using WinXP SP3. The same problem now appears in all browsers, too.
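
    When every browser on a machine resolves one site to localhost, the usual culprit is a hosts-file entry; some security tools "block" a harmful site by pinning its name to 127.0.0.1, which may be what that exception dialog did. On WinXP the file is C:\WINDOWS\system32\drivers\etc\hosts; a sketch of the kind of line to delete or comment out (the site name is a placeholder):

        # C:\WINDOWS\system32\drivers\etc\hosts
        127.0.0.1    www.blocked-example.com

    After editing, running ipconfig /flushdns and restarting the browser makes the change take effect.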

    Read the article

  • How can I parse Amazon S3 log files?

    - by artlung
    What are the best options for parsing Amazon S3 (Simple Storage) log files? I've turned on logging, and now I have log files that look like this: 858e709ba90996df37d6f5152650086acb6db14a67d9aaae7a0f3620fdefb88f files.example.com [08/Jul/2010:10:31:42 +0000] 68.114.21.105 65a011a29cdf8ec533ec3d1ccaae921c 13880FBC9839395C REST.GET.OBJECT example.com/blog/wp-content/uploads/2006/10/kitties_we_cant_stop_here_this_is_bat_country.jpg "GET /example.com/blog/wp-content/uploads/2006/10/kitties_we_cant_stop_here_this_is_bat_country.jpg HTTP/1.1" 200 - 32957 32957 12 10 "http://atlanta.craigslist.org/forums/?act=Q&ID=163218891" "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.0.19) Gecko/2010031422 Firefox/3.0.19" - What are the best options for automating the processing of these log files? I'm not using any Amazon services other than S3.
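
    If no off-the-shelf tool fits, the format is regular enough for a short script: fields are space-delimited, with [...] and "..." grouping the fields that contain spaces. A Python sketch (the field names follow the order visible in the sample line above; access.log is a placeholder path):

        import re

        # one token per field: a [bracketed] date, a "quoted" string, or a bare word
        LOG_FIELD = re.compile(r'\[[^\]]*\]|"[^"]*"|\S+')

        NAMES = ["owner", "bucket", "time", "remote_ip", "requester", "request_id",
                 "operation", "key", "request_uri", "status", "error_code",
                 "bytes_sent", "object_size", "total_time", "turnaround_time",
                 "referrer", "user_agent", "version_id"]

        def parse_line(line):
            """Map one S3 access-log line to a field-name -> value dict."""
            return dict(zip(NAMES, LOG_FIELD.findall(line)))

        with open("access.log") as f:
            for record in map(parse_line, f):
                if record["operation"] == "REST.GET.OBJECT":
                    print(record["key"], record["status"], record["bytes_sent"])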

    Read the article

  • Does placing Google Analytics code in an external file affect statistics?

    - by Jacob Hume
    I'm working with an outside software vendor to add Google Analytics code to their web app so that we can track its usage. Their developer suggested that we place the code in an external ".js" file, which he could include in the layout of his application. The Stack Overflow question "Google Analytics: External .js file" covers the technical aspect, so tracking is apparently possible via an external file. However, I'm not quite satisfied that this won't have negative implications. Does including the tracking code as an external file affect the statistics collected by Google?

    Read the article

  • Mod Rewrite - directing HTTP/HTTPS traffic to the appropriate virtual hosts

    - by kce
    I have an Apache2 web server (v. 2.2.16) running on Debian hosting three virtual hosts. The first two hosts are HTTP only (server1 and server2). The last host is HTTPS only (server3). My virtual host configuration files can be found at pastebin. I would like to use mod_rewrite to get the following behavior: any request for http://server3 is redirected to https://server3, and any request for https://server1 or https://server2 is redirected to http://server1 or http://server2 as appropriate. Currently, requesting http://server3 gives you a 403 because indexing is disabled for that host, and a request for https://server1 or https://server2 resolves to https://server3 (as it's the only virtual host running SSL). This behavior is not desirable. So far I have added a rewrite rule to the central configuration file (myServerWideConfs.conf), unfortunately with no effect. I was under the impression that this rule (or something similar) should rewrite all https:// requests for server1 and server2 to the proper http:// request:

        RewriteEngine On
        RewriteCond %{HTTP_HOST} !^server3 [NC]
        RewriteRule (.*) http://%{HTTP_HOST}

    My question is two-fold: what mod_rewrite rules should I use to accomplish this, and where should they go? Debian's packaging of Apache has a pretty granular (i.e., fractured) configuration file layout; should my rewrite rules go in /etc/apache2/apache2.conf, /etc/apache2/conf.d/myServerWideConfs.conf, or the individual virtual host files? Is mod_rewrite the right tool to accomplish this, or am I missing something in my greater Apache configuration?
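
    A sketch of one arrangement that matches the stated goal. Since server3's *:443 vhost is the only SSL listener, it is the vhost that receives every https:// request, including those for server1 and server2, so the host check has to live in that vhost rather than in a server-wide file; the per-vhost files under sites-available are the predictable home for these rules. (Note the original rule also drops the request path; $1 preserves it.)

        # inside the *:443 vhost (server3): bounce wrong-host HTTPS back to HTTP
        RewriteEngine On
        RewriteCond %{HTTP_HOST} !^server3 [NC]
        RewriteRule ^(.*)$ http://%{HTTP_HOST}$1 [R=301,L]

        # inside a *:80 vhost answering for server3: force HTTPS
        RewriteEngine On
        RewriteRule ^(.*)$ https://server3$1 [R=301,L]

    One limitation no rewrite rule can fix: a browser requesting https://server1 checks the certificate before any redirect runs, so users will first see a name-mismatch warning for server3's certificate.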

    Read the article

  • Linking a facebook app's page to an existing facebook business page

    - by Dan
    I have a Facebook app page and a separate Facebook business profile page. The business page was created (though not by me) some time before the app and its page. Is there any way to connect the two pages, or to import the content and friends from one into the other? The older profile page has some content: a set of friends and wall posts that I don't want to lose. It was created before I had a chance to set up an app page. Since the app was created more recently, it does not have any content posted to it. I intended the app page to eventually hold some advertising info for my main website (non-canvas, just using FB for the Connect API, etc.). The idea is that as people sign up on my site through Facebook's OAuth, I can use the Graph API to post to their walls. The wall posts are working as expected, but naturally they direct users to the Facebook app page, which has no content, friends, etc. I'd prefer they be directed to the original business page, where the party is really happening. Now it seems that the two pages are completely separate; what would I need to do to direct users to the business page?

    Read the article

  • How to handle geo sitemaps?

    - by Floran
    I have a site with company profiles, and I already have a sitemap.xml. However, now I'm reading about KML geo sitemaps: http://www.seomoz.org/blog/understand-and-rock-the-google-venice-update I wonder how that applies to a site with company profiles. Do I create one large KML file with the locations of all the businesses, or do I need to be more specific, e.g. locations-in-LA.kml? And what is the relationship between a KML file and sitemap.xml? It seems that I need to reference the KML files from sitemap.xml, but if that is the case, I don't know how. Help! :)
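
    On the relationship question: in the geo sitemap scheme Google documented around that time, KML files were ordinary URLs listed in sitemap.xml, typically one small KML per location page rather than one giant file. A sketch with placeholder paths (worth checking current guidance, since Google later retired dedicated geo sitemap support):

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <!-- placeholder entries: a profile page and its companion KML file -->
          <url>
            <loc>http://www.example.com/profiles/acme.html</loc>
          </url>
          <url>
            <loc>http://www.example.com/profiles/acme.kml</loc>
          </url>
        </urlset>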

    Read the article

  • 301 redirect for a page with a space in it

    - by Outerbridge Mike
    I have some pages from a client's old template-based site which have spaces in them. For example, one of the pages looks like this:

        example.com/page.php?domain_name=example.com&viewpage=Gallery %26 News

    I'm thinking that the correct way to do an htaccess 301 redirect is to include something like this:

        Redirect 301 /page.php?domain_name=example.com&viewpage=Gallery%20%26%20News http://www.example.com/gallery/

    where the new page is example.com/gallery. Is this correct?
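
    Not quite: mod_alias's Redirect matches only the URL path and never sees the query string, so that line will not fire for a page.php?domain_name=...&viewpage=... URL. mod_rewrite can test the query string, and a trailing ? on the target drops it from the redirect. A .htaccess sketch (Apache 2.2 compatible; the %20-vs-+ alternation covers both ways clients encode the spaces):

        # Redirect ignores query strings; match them with mod_rewrite instead
        RewriteEngine On
        RewriteCond %{QUERY_STRING} viewpage=Gallery(%20|\+)%26(%20|\+)News [NC]
        RewriteRule ^page\.php$ http://www.example.com/gallery/? [R=301,L]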

    Read the article

  • Migration from one domain to another - Transferring the social media stats

    - by Dipak Saraf
    I am planning to move my site from one domain to another, i.e. from domain a.com to b.com. The site has a lot of content, but migrating the content is not an issue, and 301 redirects will take care of all the backlinks. My real worry is transferring the social media share links and stats from a.com to b.com. I need some insight into whether, and how, these can be migrated seamlessly from a.com to b.com.

    Read the article

  • Webmaster tools, Duplicate Meta Descriptions, and Short Meta Descriptions [closed]

    - by Watsy91
    Possible Duplicate: Do meta keywords have any impact on ranking algorithms? I am fairly new to the whole Webmaster Tools concept. I have been looking at all the different options, such as crawl errors, HTML improvements, etc., and in particular at the Duplicate Meta Descriptions and Short Meta Descriptions reports, and I was wondering if anyone could suggest how to go about improving them. It seems that all the duplicates come from the URL title and the short description, and it would seem to me that most people would describe a page with the same keywords as its title. Here's an example of one: "These are the ultimate hampers in taste, quality and value. Amongst this range of luxury hampers ar" /food-hampers/food-hampers-over-100.html /thank-you-gifts/large-gifts-over-100.html To get to the point: do these things really matter? Would they have a real effect on my sites' rankings? My sites have been falling down the rankings since early this year, and I have really started looking at Google Analytics and Webmaster Tools to try to pinpoint problems. I have researched the Internet, and it seems that some people bother with this and others don't! I know Stack Overflow has hundreds of people who have been through the above, and I would really appreciate it if they could give me some tips. Or, in the end, does it really matter? :D

    Read the article

  • How should I structure my urls for both SEO and localization?

    - by artlung
    When I set up a site in multiple languages, how should I structure my URLs for search engines and usability? Let's say my site is www.example.com, and I'm translating into French and Spanish. What is best for usability and SEO?

    Directory option:

        http://www.example.com/sample.html
        http://www.example.com/fr/sample.html
        http://www.example.com/es/sample.html

    Subdomain option:

        http://www.example.com/sample.html
        http://fr.example.com/sample.html
        http://es.example.com/sample.html

    Filename option:

        http://www.example.com/sample.html
        http://www.example.com/sample.fr.html
        http://www.example.com/sample.es.html

    Accept-Language header: or should I simply parse the Accept-Language header and generate content server-side to suit it? Is there another way to do this? If the different language versions don't have different URLs, what do I do about the search engines?
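
    Whichever URL scheme wins, the variants can be tied together with rel="alternate" hreflang links, which also answers the Accept-Language idea: serving different content at a single URL hides the other languages from crawlers, while separate URLs cross-referenced like this keep every version indexable. A sketch for the directory option:

        <!-- in the <head> of each language variant, listing all versions -->
        <link rel="alternate" hreflang="en" href="http://www.example.com/sample.html" />
        <link rel="alternate" hreflang="fr" href="http://www.example.com/fr/sample.html" />
        <link rel="alternate" hreflang="es" href="http://www.example.com/es/sample.html" />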

    Read the article
