Search Results

Search found 24735 results on 990 pages for 'site ranking'.


  • Can I use a 302 redirect to serve up static content from a URL with escaped_fragment?

    - by Starfs
    We would like to serve up SEO-friendly Ajax-driven content, and we are following Google's documentation on crawlable Ajax. Has anyone ever tried writing a 302 redirect into the .htaccess file that takes the ?_escaped_fragment_= string and sends it to a static page, for example /snapshot/yourfilename/? How will Google react to this? I've gone through the documentation and it's not very clear. The quote below is all I can find, and I'm not sure whether they are saying that you can redirect the _escaped_fragment_ URL to a different static page, or whether this is about redirecting the hashbang URL to static content. Thoughts?

    From Google's site: Question: Can I use redirects to point the crawler at my static content? Redirects are okay to use, as long as they eventually get you to a page that's equivalent to what the user would see on the #! version of the page. This may be more convenient for some webmasters than serving up the content directly. If you choose this approach, please keep the following in mind: Compared to serving the content directly, using redirects will result in extra traffic because the crawler has to follow redirects to get the content. This will result in a somewhat higher number of fetches/second in crawl activity. Note that if you use a permanent (301) redirect, the url shown in our search results will typically be the target of the redirect, whereas if a temporary (302) redirect is used, we'll typically show the #! url in search results. Depending on how your site is set up, showing #! may produce a better user experience, because the user will be taken straight into the AJAX experience from the Google search results page. Clicking on a static page will take them to the static content, and they may experience avoidable extra page load time if the site later wants to switch them to the AJAX experience.
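    Something along these lines is what I have in mind for the .htaccess (a rough sketch; it assumes the #! URLs live on the site root, the snapshots are pre-generated, and the paths are placeholders):

      RewriteEngine On
      # the crawler requests /?_escaped_fragment_=yourfilename in place of /#!yourfilename
      RewriteCond %{QUERY_STRING} ^_escaped_fragment_=(.+)$
      # answer with a temporary redirect to the pre-rendered snapshot; the trailing ? drops the query string
      RewriteRule ^$ /snapshot/%1/? [R=302,L]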

    Read the article

  • Pages partially load on rapid refresh

    - by user101570
    I recently set up a VPS slice with 256 MB of RAM to run a LAMP stack (Ubuntu 11.04, Apache 2, MySQL, PHP 5). So far I'm only running a simple WordPress site on an IP-based virtual host I set up. The performance is excellent, but I've noticed that if I send multiple HTTP requests from the same IP in a short time period, only partial pages are rendered. If I then wait a bit and refresh, the entire page loads again. I noticed this behaviour when accessing the site from two browsers on my office desktop, but it also shows up if I quickly navigate the site from a single browser (any browser). I'm guessing this is an Apache phenomenon, as the pages render correctly except under the conditions above, but perhaps I'm wrong. Could it be my hosting company with some kind of DoS protection in place? As a relative Linux/server noob, I'd really appreciate any insight into which Apache settings could explain this behaviour, and how I might go about changing them.
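    For reference, this is the kind of conservative prefork tuning I've been thinking of trying on the 256 MB slice (a sketch for Apache 2.2; the numbers are guesses to experiment with, not a known fix):

      <IfModule mpm_prefork_module>
          StartServers          2
          MinSpareServers       2
          MaxSpareServers       4
          MaxClients           10
          MaxRequestsPerChild 500
      </IfModule>
      KeepAlive On
      KeepAliveTimeout 2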

    Read the article

  • Summary of usage policies for website integration of various social media networks?

    - by Dallas
    To cut to the chase... I look at Twitter's usage policy and see limitations on what can and can't be done with their logo. I also see examples of websites that use icons that have been integrated with the look and feel of their own site. Given Twitter's policy, for example, it would appear that legal conversations/agreements would need to take place to do this, especially on a commercial site. I believe it is perfectly acceptable to have a plain text button that simply has the word "Tweet" on it, with the same functionality. My question is whether anyone can provide online (or other) references that attempt to summarize what can and can't be done when integrating various social networks into your own work. The answer I mark as correct will be the one that provides the best resources summarizing what can and can't be done with specific logos/icons, with a secondary factor being the variety of social networking sites addressed in your answer. Before people point to specific questions, I am looking for a well-rounded approach that considers a breadth of networks and considerations. Background: I would like to incorporate social media icons and functionality, but would like to consider what type of modifications can be done without needing to involve lawyers. For example, can I bring in a standard Facebook logo but incorporate my site's color into it? Would the answer differ if I maintained their color but added in a few pixels of another color as a transition? I am not saying I want to do this, but rather using it as an example.

    Read the article

  • Hide directory contents from showing when accessing the URL directly

    - by SoLoGHoST
    On my site, if you browse to http://example.com/images/ the contents of the entire directory are listed. How can I make it so that this listing doesn't show up when people browse directly to http://example.com/images/? Can I create an .htaccess file in that directory, or is there a better way? I really don't want people being able to do this for every directory on the site. What can I do to prevent it? I figure it's either something that has to be done in Apache or a global .htaccess file placed in the public_html folder, perhaps? EDIT: I worked around this by dropping an index.php file into the directory, but I still feel that security is an issue here. How can I fix this permanently?
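    Is something like this in a global .htaccess (or in the vhost's Directory block) the right way to do it? Just a sketch of what I mean:

      # public_html/.htaccess -- turn off automatic directory listings site-wide
      Options -Indexes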

    Read the article

  • SharePoint 2010 MySites - Simple explanation needed!

    - by Chris W
    I've been playing around with the 2010 beta for a couple of weeks, experimenting with topology options etc. I think I've got myself totally confused as to how it works, so if there are any SharePoint experts out there who can explain things in simple terms, I'd appreciate it! I want to set up a farm with 3 servers providing the content and MySites. I presume the way to do this is to load balance or DNS round robin traffic between the 3 servers. The bit where I'm confused is that the My Site Settings page asks for a specific My Site Host, which suggests all My Site traffic will be pushed to a single server even though we have 3 in the farm. If this host fails, I presume MySites will be unavailable. Is that right? How do I configure it so that access to MySites is load balanced across the 3 servers in the farm?

    Read the article

  • Restart a single uWSGI application (when running in emperor mode)

    - by Oli
    I'm running uWSGI in emperor mode to host a bunch of Django sites based on their individual configs. These are supposed to reload when the emperor detects a change in the config file, and this largely works when I just touch the relevant uwsgi.ini file. But occasionally I'll mess something up in the Django site and the app won't load. Yeah, yeah, I should be testing better, but that's not really the point. When this happens, uWSGI seems to mark the site as dead and stops trying to run it (which seems to make sense). Even after I fix the underlying issue, no amount of touching will get that site's uWSGI process up and running again. I have to reload the whole uWSGI server (knocking dozens of sites out at once for a few seconds). Is there a way to force uWSGI to just reload one of its sites?
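    For reference, what I've been trying (the paths just reflect how my vassal configs happen to be laid out; the second command is a guess at a workaround rather than something I know is supported):

      # reload a single vassal that is still alive
      touch /etc/uwsgi/vassals/mysite.ini

      # crude trick for a vassal the emperor has given up on: remove it, wait, put it back
      mv /etc/uwsgi/vassals/mysite.ini /tmp/ && sleep 2 && mv /tmp/mysite.ini /etc/uwsgi/vassals/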

    Read the article

  • Mitigating the 'firesheep' attack at the network layer?

    - by pobk
    What are sysadmins' thoughts on mitigating the 'firesheep' attack for the servers they manage? Firesheep is a new Firefox extension that allows anyone who installs it to sidejack any sessions it can discover. It does its discovery by sniffing packets on the network and looking for session cookies from known sites, and it is relatively easy to write plugins for the extension to listen for cookies from additional sites. From a systems/network perspective, we've discussed the possibility of encrypting the whole site, but this introduces additional load on the servers and messes with site indexing, assets and general performance. One option we've investigated is to use our firewalls to do SSL offload, but as I mentioned earlier, this would require the whole site to be encrypted. What are the general thoughts on protecting against this attack vector? I've asked a similar question on Stack Overflow, but it would be interesting to see what systems engineers think.
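    For context, if we did go the full-SSL route, this is the kind of header hardening we'd expect to pair with it (a sketch for Apache 2.2.4+ with mod_headers; our actual offload setup may differ):

      # mark cookies so browsers only send them over HTTPS (and keep them away from script)
      Header edit Set-Cookie ^(.*)$ "$1; Secure; HttpOnly"
      # ask browsers to stick to HTTPS for a year
      Header always set Strict-Transport-Security "max-age=31536000"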

    Read the article

  • Hosting and domain registrations for multiple clients

    - by letseatfood
    I am finally getting regular work designing, developing, and deploying websites for small businesses and individuals. So far the websites use single-user content management systems, so they create, as far as I know, minimal load on the shared servers. I have always required that each of my clients purchase annual shared hosting at Dreamhost. For domain registration, I ask that they register with Dreamhost, but some already have a domain registered elsewhere and that is fine with me. I do this so the billing issues are the client's responsibility, not mine. My question is: since I can register unlimited domains and connect them to my one shared hosting account at Dreamhost, should I stop requiring clients to individually pay for shared hosting and a domain? Should I instead be paying for one hosting account and hosting all of my clients' websites on that account? As I said before, I currently have each client buy their own hosting because I feel that, for example, if there is high traffic to their site, there is less of a chance of the site going down than if their site were hosted with many others on one account. I am famous for being long-winded, so please let me know if I can clarify at all. Thanks!

    Read the article

  • How to edit aspx, cshtml and other kinds of files live on an FTP server?

    - by Anirudha
    Originally posted on: http://geekswithblogs.net/anirugu/archive/2013/06/27/how-to-edit-aspx-cshtml-and-other-kind-of-files.aspx Many times we just want to make a small change on a site and we don't want to download the whole project again. In this post I will show you some good ways to do it. People who have Expression Web 4 can do it; I tried it and it works well with aspx files. If your site is built on ASP.NET with the aspx engine, this is a good option, and Expression Web is now free (it was previously paid software). Another good option is Komodo Edit: with a few plugins you can make FTP editing work for you. The problem with these two apps is that they have no syntax highlighting or support for cshtml files, which were introduced with MVC 3. For that I suggest you go with WebMatrix, which you can use to edit cshtml files online. Remember that WebMatrix doesn't support compiling MVC projects; you need at least Visual Web Developer Express to compile your project. If you are in a hurry, try https://c9.io/: put in your FTP settings and you are ready to make changes on the live site. If you have anything else in mind, share it here.

    Read the article

  • SharePoint Returning a 401.1 for a Specific User/Computer

    - by Joe Gennari
    We have a SharePoint Services 3.0 site set up supporting about 300 users right now. This report is isolated and has never been duplicated. We have one AD user who cannot log into the SharePoint site with his account from his machine and is subsequently returned a 401.1 error. If any other user tries to log on with their account from his machine, it works okay. If he moves to another machine and logs on, it works okay. The only solution to this point has been to install Firefox on the machine; when he authenticates with FF, everything is okay. Remedies tried so far:
    - Cleared cookies/cache
    - Turned off/on Integrated Windows Authentication in IE
    - Downgraded IE 8 to IE 6
    - Removed the site from the Intranet Sites zone
    - Renamed the machine
    - Disjoined/rejoined the domain

    Read the article

  • Design question for a transportation agency/workflow system

    - by George2
    I am designing a transportation agency/workflow system, and it involves 3 types of people: customers who request to have stuff transported, drivers who deliver the stuff, and truck managers who coordinate source/destination trucks and communicate with and organize the drivers. The system is expected to be a web site, and all 3 kinds of people could use it to submit requests, accept requests, monitor the status of a specific shipment, etc. The web site is more like an open agency or a workflow system. I am wondering whether there are any existing technologies, tools or projects (preferably open source, but not a must) that I could build my application on top of to go faster. I prefer to use .NET technologies, but that is not a must either. Thanks in advance!

    Read the article

  • IIS 7 SSO stops working during high CPU load? [migrated]

    - by DanB
    On our IIS 7 site (Windows 2008 Server), we have set up single sign-on (SSO). It seems to work fine most of the time, but when the CPU load becomes high, SSO authentication completely stops working. I did some research and tried the suggestion to increase the max number of worker processes in the default app pool, but the increase did not help. Some details:
    - The site is a WordPress blog.
    - The server has plenty of RAM (2 GB) and free disk space.
    - SSO is achieved by putting a copy of the WordPress login page (wp-login.php) into a subfolder below the root that has anonymous authentication disabled, and then redirecting the browser to it. This was the recommendation Microsoft gave to our consultants.
    - To increase CPU load for testing, I have three scripts hit the home page simultaneously, over and over. This drives the CPU to 100%. While these scripts are running, SSO authentication simply doesn't happen. As soon as I stop the scripts, SSO works again. (I should mention that the SSO problem also happens when many users visit the site at once...)
    - The WordPress database process (mysqld) is not stressed at all by the scripts.
    I would be happy to provide further diagnostics. Any help appreciated!

    Read the article

  • IIS and PHP restrict IO permissions

    - by ULTRA_POROV
    I have PHP installed through a FastCGI module. Is there a way to restrict the module's (php.exe) read/write permissions to only the directory (plus subdirectories) of the IIS site that is calling it? I need this to prevent one IIS PHP site from having access to files outside its own directory. How do I do this? Is there a setting in php.ini or in the IIS configuration? I believe such a feature could exist: when a file on the server is requested, the root path of the site is also known, so all it would take is for IIS to pass this path to the PHP module, and the PHP module could then allow only IO operations within this path. PS: I know it is possible to achieve this by using a different Windows account for each website, but this is not an option.
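    To illustrate what I'm after, a per-site restriction along these lines is the closest thing I've found so far (a sketch; the paths are made up, and I'm not sure it fully confines php.exe under FastCGI):

      ; in the site's own php.ini, or a .user.ini in the site root (PHP 5.3+)
      open_basedir = "C:\inetpub\wwwroot\site1\;C:\Windows\Temp\"
      upload_tmp_dir = "C:\inetpub\wwwroot\site1\tmp"
      session.save_path = "C:\inetpub\wwwroot\site1\tmp"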

    Read the article

  • Recommend a company to take care of webserver and WordPress management?

    - by javipas
    I'm interested in setting up a professional WordPress site, but I'd like to explore the possibility of leaving the management of the webserver, and even of WordPress itself, to a company that guarantees great availability, site performance (load times, security) and even SEO. My site is currently running on another platform, but I plan to migrate in the next 4 weeks. I've usually done this myself, but I'd like to focus on the content, so I don't have to mess with webserver/MySQL/PHP configs in order to get good performance. Is there some (maybe hosting) company dedicated to this? Would it be better to hire a sysadmin with experience in these matters?

    Read the article

  • Linux foxboard network monitor

    - by het.oosten
    I want to use a Foxboard as a simple network monitor for multiple routers (all routers are connected to the internet). The Foxboard is a mini PC with an embedded version of Debian. My idea is to use multiple virtual network interfaces like this:
    - eth0   192.168.2.10
    - eth0:1 192.168.3.10
    - eth0:2 192.168.4.10
    I found a nice Python script to ping an external host here (the solution from Ryan Cox): http://stackoverflow.com/questions/316866/ping-a-site-in-python
    Is it possible to configure Debian to use eth0 when I ping www.site-a.com and eth0:1 when I ping www.site-b.com?
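    To show what I mean, here is a rough, untested sketch (the host names and the address-to-site mapping are placeholders) of how I'd like each alias to ping its own site, built around iputils ping's -I option:

      import subprocess

      # hypothetical mapping: source address of each alias -> the site it should monitor
      SITES = {
          "192.168.2.10": "www.site-a.com",   # eth0
          "192.168.3.10": "www.site-b.com",   # eth0:1
          "192.168.4.10": "www.site-c.com",   # eth0:2
      }

      def ping_from(source_ip, host, count=2):
          """Return True if `host` answers pings sent from `source_ip` (ping -I)."""
          cmd = ["ping", "-I", source_ip, "-c", str(count), host]
          with open("/dev/null", "w") as devnull:
              return subprocess.call(cmd, stdout=devnull, stderr=devnull) == 0

      if __name__ == "__main__":
          for src, host in SITES.items():
              print(src, host, "up" if ping_from(src, host) else "DOWN")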

    Read the article

  • Multiple SSL on same IP [closed]

    - by kadourah
    Possible Duplicate: Multiple SSL domains on the same IP address and same port?
    I have the following situation. First domain:
    - test.domain.com
    - IP: 1.2.3.4
    - Port: 443
    - SSL: purchased from GoDaddy and specific to that domain
    This works fine, no issues. I would like to add another site, test2.domain.com:
    - IP: the same
    - Port: can be different
    - SSL: different, since I can't reuse the certificate above because it's specific to the first site
    Now, when I add the HTTPS binding to the second site with that IP:port combination, it appears to always serve the first certificate, ignoring the second one. How can I add a second SSL binding on the same IP using a different certificate? Can this be done?

    Read the article

  • How do I troubleshoot a page not found error when configuring IIS6 on Windows Server 2003?

    - by Vinicius Ottoni
    I have configured IIS6 on my Windows Server 2003 machine following this guide: http://www.simongibson.com/intranet/iis6/ After that I created a new web site inside the Web Sites directory. Inside the physical path I created an index.htm that contains:
      <html>
        <body>Test</body>
      </html>
    But I get the following error: "The page cannot be found". When I put the same index file inside the Default Web Site's physical path, it works. I configured the new web site with the guide above using the IP configuration and without a Host Header. What should I do to troubleshoot this, or is there an obvious configuration error?

    Read the article

  • Run a single PHP codebase under multiple domains

    - by Acharya
    Hi all, I have a PHP code/site at xyz.com. Now I want to run the same site under multiple domains, meaning that when somebody opens domain1.com, domain2.com, domain4.com, and so on, it should run the code/site that is at xyz.com. I know one way to do this: I can point all these domains at the server where xyz.com is hosted, so all of them serve the same piece of code/site. But in that solution I need to set up the domains manually. Is there any other way to do this, as I want to add domains dynamically? Thanks in advance!
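    For example, on Apache the manual way I know looks roughly like this (a sketch; the domain names are placeholders), and my question is whether there's a way to avoid editing it for every new domain:

      <VirtualHost *:80>
          ServerName xyz.com
          # each extra domain is just another alias; if this is the only/default vhost,
          # any domain pointed at this IP ends up serving the same code
          ServerAlias domain1.com domain2.com domain4.com
          DocumentRoot /var/www/xyz
      </VirtualHost>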

    Read the article

  • Why do Google search results include pages disallowed in robots.txt?

    - by Ilmari Karonen
    I have some pages on my site that I want to keep search engines away from, so I disallowed them in my robots.txt file like this:
      User-Agent: *
      Disallow: /email
    Yet I recently noticed that Google still sometimes returns links to those pages in their search results. Why does this happen, and how can I stop it?
    Background: Several years ago, I made a simple web site for a club a relative of mine was involved in. They wanted to have e-mail links on their pages, so, to try and keep those e-mail addresses from ending up on too many spam lists, instead of using direct mailto: links I made those links point to a simple redirector / address-harvester trap script running on my own site. This script would return either a 301 redirect to the actual mailto: URL or, if it detected a suspicious access pattern, a page containing lots of random fake e-mail addresses and links to more such pages. To keep legitimate search bots away from the trap, I set up the robots.txt rule shown above, disallowing the entire space of both legit redirector links and trap pages. Just recently, however, one of the people in the club searched Google for their own name and was quite surprised when one of the results on the first page was a link to the redirector script, with a title consisting of their e-mail address followed by my name. Of course, they immediately e-mailed me and wanted to know how to get their address out of Google's index. I was quite surprised too, since I had no idea that Google would index such URLs at all, seemingly in violation of my robots.txt rule. I did manage to submit a removal request to Google, and it seems to have worked, but I'd like to know why and how Google is circumventing my robots.txt like that and how to make sure that none of the disallowed pages will show up in their search results.
    P.S. I actually found out a possible explanation and solution, which I'll post below, while preparing this question, but I thought I'd ask it anyway in case someone else might have the same problem. Please do feel free to post your own answers. I'd also be interested in knowing if other search engines do this too, and whether the same solutions work for them as well.
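    For reference, the kind of fix I'm experimenting with is a noindex header instead of (or alongside) the robots.txt rule, on the understanding that Google has to be allowed to crawl the URL in order to see the header. A sketch, assuming Apache with mod_headers and that /email maps to a real script file on disk:

      <IfModule mod_headers.c>
          <FilesMatch "^email">
              Header set X-Robots-Tag "noindex, nofollow"
          </FilesMatch>
      </IfModule>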

    Read the article

  • Which programming language should I choose if I want to build this website...? [closed]

    - by Goma
    Assume that I will start with just a photo sharing website, where every user can add comments to any photo. After that the site will contain general news; the admin and the moderators can add news, and users can add comments on the news. The website will also provide a photo uploader, so every user will have up to 20 MB to upload any photos they want. Other users may or may not be able to see these photos, depending on whether the owner chose to publish them. The site should also have a small forum, where the admin can add categories and users can add topics and replies to each topic in those categories. These are the things I can think of now, but the website will add other features and services later on. Can you tell me which programming language can help me do all that? I need a programming language that provides the following:
    1. fast page loads for the site;
    2. the ability to add more functions quickly and to edit code easily for any reason;
    3. security;
    4. fast display of information from the database.

    Read the article

  • Serve up syntactic XHTML5 using the text/html MIME type?

    - by cboettig
    I have a site currently written with HTML5 tags. I'd like to be able to parse the site as XML, with support for namespaces etc., to facilitate programmatic extraction of data. Currently I have
      <!DOCTYPE html>
    and
      <meta charset="utf-8">
    which I gather is equivalent in HTML5 to explicitly setting the content type as
      <meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
    for my current setup. In order to serve XML, it sounds like the right thing to do is
      <?xml version="1.0" encoding="UTF-8"?>
      <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
      <html xmlns="http://www.w3.org/1999/xhtml">
    Should I also change my Content-Type to
      <meta http-equiv="content-type" content="application/xhtml+xml; charset=iso-8859-1" />
    or is that not necessary? What is the advantage of having the content type be "application/xhtml+xml"? What is the disadvantage? (It sounds like it may break Internet Explorer's rendering of the site, but maybe that information is out of date now?) Many thanks!
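    In case it matters, the kind of server-side negotiation I was considering looks like this (a sketch for Apache with mod_rewrite; I haven't verified this is the right approach):

      RewriteEngine On
      # send the XHTML media type only to clients that claim to accept it
      RewriteCond %{HTTP_ACCEPT} application/xhtml\+xml
      RewriteRule \.html$ - [T=application/xhtml+xml]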

    Read the article

  • Remote Desktop Problem on Windows Server 2008 R2

    - by lukiffer
    Revised this question to be more concise, consolidating several revisions.
    Symptoms, from a domain-member Windows 7 client:
    - Domain credentials to a domain controller = success
    - Domain credentials to a member server (by hostname or FQDN) = success
    - Domain credentials to a member server (by IP) = fail
    - Local credentials to a member server (by either) = success
    From a non-domain-member Windows 7 client:
    - Domain credentials to a domain controller = success
    - Domain credentials to a member server = fail
    - Local credentials to a member server = success
    (Identical behavior from a Mac RDC 2.1 client.)
    Server configuration details:
    - Windows 2008 R2 Datacenter w/ SP1
    - The domain in question is a subdomain of a Windows 2008 domain (forest root). The root has DCs in both Site A and Site B; the subdomain only has DCs in Site B.
    - RDP is operating normally on all root member servers and DCs.
    - No remote desktop settings are defined by GPOs.
    - Network Level Authentication is enabled; all clients are compatible and the certificate exchange/SSL handshake completes successfully.
    - Not catching any errors in the netlogon log.

    Read the article

  • How do spambots work?

    - by rlb.usa
    I have a forum that's getting hit a lot by forum spambots, and of course the best way to defeat something is to know thy enemy. I'll worry about defeating those spambots later; right now I'd like to know more about them. Reading around, I was surprised at the lack of thorough information on the subject (or perhaps at my ineptness at entering the right search terms for better Google results). I'm interested in learning all about spambots. I've asked on other forums and gotten brush-off answers like "Spambots are always users registering on your site." Specifically:
    - How do forum spambots work? How do they find the 'new user registration' page? (I'm especially surprised because some forums don't have a dedicated URL for this, e.g. www.forum.com/register.html, but instead use query strings or even other methods invisible to the URL bar.)
    - How do they know what to enter into each 'new user registration' field? How do they determine what's a page they can spam / enter data into and what is not?
    - Do they even 'view' this page at all? If not, then I'd assume they're communicating with the server directly; how is this possible? How do they do it?
    - Can forum spambots break CAPTCHAs? Can they solve logic questions (how?)? Math questions? Do they reverse-engineer client-side anti-bot validation scripts? Server-side scripts?
    - What techniques are still valid to prevent them?
    - Where do spambots come from? Is someone sitting behind the computer snickering as they watch their bot destroy site after site? Or are they snickering as they simply 'release' it onto the internet somehow? Are spambots 'run' by an infected computer somewhere? Do they replicate themselves? etc.

    Read the article

  • SharePoint doesn't support this authentication scheme.

    - by EtherDragon
    I have a new Windows Phone 7 phone, and I'm trying to work out how to connect the Office application to our SharePoint site(s). In the Office application on the phone I flip to the SharePoint page, go to Open URL, and enter the URL for one of my sites, which uses default authentication (Windows auth). I get the message: "Can't open. SharePoint doesn't support this authentication scheme. For assistance, contact the person who manages this SharePoint site" (that would be me). "You can try opening the content in your web browser instead." When I open it in my browser, I can access the content without any problem (Windows auth passes). Does anyone have any source material on what I should do to my SharePoint site to "support this authentication scheme"? Note: I am the administrator of our SharePoint server farm(s).

    Read the article

  • .htaccess redirect not preserving http_referer header

    - by CodeToaster
    We're merging with another company, and we want to redirect content from their (Apache) website to our (IIS) site. When traffic arrives at our site, we inspect the HTTP_REFERER, and if the visitor was just redirected from the company's site that we just merged with, they'll be presented with a "splash" page announcing the merger. I've added the line... Redirect / http://www.oursite.com/ ...to their .htaccess, which works fine, except that when the browser is redirected it doesn't send the HTTP_REFERER header. I've tried redirecting with redirect codes 301, 302 and 307 (the default, I believe, is 302) and all have the same effect (redirects fine, but no HTTP_REFERER). Can anyone provide some insight into why HTTP_REFERER wouldn't be included?
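    One workaround I'm considering (just a sketch; the "from" parameter name is made up) is to tag the redirect itself so our IIS side can detect visitors from the old site without relying on HTTP_REFERER:

      RewriteEngine On
      # send everything to the new site, adding a marker parameter the splash logic can key off;
      # QSA appends any query string from the original request
      RewriteRule ^(.*)$ http://www.oursite.com/$1?from=oldsite [R=302,L,QSA]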

    Read the article
