
  • How would a search engine see URL-encoded characters?

    - by K20GH
    I've got my URLs, but some of the strings in them contain &. Best practice says I can't use & literally, so I've replaced it with +. However, if I URL-encoded the & instead, it would become %26. How would a search engine see that? Would it treat %26 as an & and still return the URL, or would it see it literally as %26? i.e., would www.example.com/sweet?m&m show as that, or would a search engine see it as www.example.com/sweet?m%26m?
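
    For reference, a minimal sketch of how percent-encoding round-trips in JavaScript (the parameter name q is illustrative):

        var query = 'm&m';
        // encodeURIComponent percent-encodes reserved characters such as &
        console.log(encodeURIComponent(query));    // "m%26m"
        // decodeURIComponent reverses it
        console.log(decodeURIComponent('m%26m'));  // "m&m"
        console.log('http://www.example.com/sweet?q=' + encodeURIComponent(query));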

  • When choosing a domain does including your brand affect SEO performance?

    - by bpeterson76
    I've been asked to build a "landing page" for a local branch of an international corporation. While the corporation has a well-established domain name, the local office wants to use a unique, separate URL that will be easy to relay to clients. However, the corporation is considered a category leader, so the local office is also concerned about the importance of carrying the company's brand over to the URL. Questions that have arisen:
    - From an SEO perspective, is there a benefit to including the brand name in the URL?
    - Would it be more beneficial to buy a domain that relates generically to the INDUSTRY as opposed to the specific brand name?
    - Would the benefits of an easy-to-remember, short domain outweigh any SEO benefits that might be gained by a longer, brand-specific domain?

  • Should I use Heroku or should I have my own SSL? [closed]

    - by user1744649
    Based on your experience, can you please advise which would be better for me? The issue: I build applications and there are two major constraints:
    1. SSL is needed, since I use Facebook APIs, so Heroku is the only good option.
    2. My web components tend to hit Max_Execution_Time very often, since I pull a lot of data using the Facebook APIs.
    Future possible purposes of this site:
    1. Will use more APIs from Google, Twitter, etc. in the future.
    2. Might request donations.
    3. Just a hobby.
    I have two options:
    1. Create the site on Heroku itself, converting all the PHP components to a background worker in Python using Django.
    2. Don't use Heroku at all: do the complete hosting with GoDaddy (shared plan) and buy an SSL certificate so that I can use the FB APIs, etc.
    In this scenario, what do you suggest I do?

  • What is the most secure environment for multiple CMS sites? [closed]

    - by Brian Gulino
    I wish to run about 50 low-traffic Joomla or WordPress websites on one server, or part of a server. Each website will be managed by its own naive owner, who will be able to access the Joomla or WordPress backend of that website. I am concerned about security and isolation, as my users will periodically get into trouble by not protecting their sites properly. Two alternatives I know of exist:
    1. Run one Linux system with multiple websites under Apache. Follow current Joomla and WordPress security tips, and increase the isolation of the individual sites by using mpm-itk, which allows each website to run as its own user.
    2. Run virtualization software such as the Xen hypervisor, giving each site its own virtual Linux system.
    I lack the experience needed to make this decision and am asking which path to take. Obviously, there may be other alternatives that I haven't considered.
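
    For the mpm-itk route, a sketch of per-site user assignment (the vhost name, paths, and accounts are hypothetical; AssignUserID is the directive mpm-itk provides):

        <VirtualHost *:80>
            ServerName site1.example.com
            DocumentRoot /var/www/site1
            # mpm-itk: serve this vhost's requests as a dedicated user/group
            AssignUserID site1user site1group
        </VirtualHost>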

  • Is it possible to get a free web host for my registered domain? [closed]

    - by Ahmed Alsayadi
    Possible Duplicate: How to find web hosting that meets my requirements?

    I searched online for many free web hosting services like NetFirms; most of them ask you to register for their sub-domain or to buy a new domain, but I already have one, which I bought through GoDaddy. Now I hope to find a free web host for my website (the site's size is less than 20 MB). Any idea which web host can meet such requirements?

  • Since Google reduces the value of links alongside nofollow links, what is an alternative?

    - by SharkTheDark
    Since 2009, Google has counted nofollow links as outgoing links too, and thus reduces the value of the other links on the page. What are some alternatives that stop Google from counting the outside links on my page? Suppose I make links appear in my page source like this: <span hrefs="http://link" rel="nofollow" link="true">Link Name</span> and then, in JavaScript, replace the span with an a tag and replace hrefs with href for every span tag that has link="true". Will this help?
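
    A minimal sketch of the span-to-anchor swap described above (the hrefs and link attribute names are taken from the question):

        // run after the DOM has loaded: replace each marked <span> with a real <a>
        var spans = document.querySelectorAll('span[link="true"]');
        for (var i = 0; i < spans.length; i++) {
            var span = spans[i];
            var a = document.createElement('a');
            a.href = span.getAttribute('hrefs');   // promote hrefs to a real href
            a.rel = span.getAttribute('rel');
            a.textContent = span.textContent;
            span.parentNode.replaceChild(a, span);
        }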

  • Is there any Google Adsense revenue if a visitor rolls over (hovers) on an ad unit?

    - by torr
    I have noticed an increase in interactive Flash animations, especially in 300px-wide AdSense ads. Many of them ask the visitor to roll over to reveal what the ad is about, show a clip, etc. So I wonder: the visitor is giving attention to this ad and viewing its message without clicking on it. In essence, the ad agency's objective is accomplished without a click, which would be a significant money saver where PPC is concerned. This seems very ingenious on their part, and I wonder how Google handles it. Shouldn't there be a fee for the publisher if visitors interact with ads, regardless of clicks? CTR becomes irrelevant in this context. Are you aware of anything being discussed in this respect?

  • Google indexing pages with #! although we don't have any

    - by Benjamin Gruenbaum
    Our company has developed a Single Page Application using AngularJS and its routing. Google indexed our site decently with JavaScript, but it did not index some pages very well, so we developed an HTML-only version. We followed the AJAX Crawling Specification posted here, and we have a <meta name='fragment' content='!'> tag and canonical URLs. We expect http://www.example.com/foo/bar to be fetched from http://www.example.com/?_escaped_fragment_=/foo/bar. However, we have found that since we rolled out the AJAX crawling specification, all our pages are now indexed twice: once as the JavaScript version at http://www.example.com/foo/bar, and once as the new version at http://www.example.com/#!/foo/bar. This is harmful to us, since it's duplicate content and also misrepresents our site. I have tried looking for similar questions here and in the Google product forum but could not come up with anything.
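
    For concreteness, a sketch of the two tags described above as they would appear in the head of the HTML snapshot (the URLs follow the question's example):

        <meta name="fragment" content="!">
        <link rel="canonical" href="http://www.example.com/foo/bar">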

  • Change player in JavaScript game [migrated]

    - by KLUSTER
    On clicking the start button, Math.random decides which player starts the game. There are four pictures: two of them for player1 and player2, and another two to show whose turn it is. I need help with this: on a button click, it should become the next player's turn.

        function game(){
            var PlayerTurn;
            PlayerTurn = parseInt(Math.random()*2);
            if (PlayerTurn == 0) {
                PlayerTurn = 1;
                window.document.player1.src = "Cache/Player3.PNG";
            } else {
                PlayerTurn = 0;
                window.document.player2.src = "Cache/Player4.PNG";
            }
        }

    Any help is appreciated.
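
    A minimal sketch of one way to make the turn persist between clicks, assuming the goal is simply to alternate players: keep the turn in a variable outside the function (the names startGame and nextTurn are illustrative):

        var playerTurn;                      // persists between clicks

        // wire to the start button: pick a random first player
        function startGame() {
            playerTurn = Math.floor(Math.random() * 2);
            showTurn();
        }

        // wire to the turn button: switch to the other player
        function nextTurn() {
            playerTurn = 1 - playerTurn;
            showTurn();
        }

        // swap the turn image to match the current player
        function showTurn() {
            if (playerTurn == 0) {
                window.document.player1.src = "Cache/Player3.PNG";
            } else {
                window.document.player2.src = "Cache/Player4.PNG";
            }
        }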

  • Is this the place to ask SEO questions [on hold]

    - by user39206
    I'm adding business listings to an existing website that Google has been referencing reasonably well, but my meta tags leave a lot to be desired! Due to other work commitments I have somewhat neglected the site, but now I think I should put the proper time and effort into getting it just right, as it does earn me money. I am a developer, so I have all the necessary skills to build the site; it's almost done! Now I'm just a little worried that I could do something wrong and lose the rapport I've built up with Google in the past, or be banned. Especially with the new tags I see, like: <link rel="publisher" href="https://plus.google.com/[YOUR BUSINESS G+ PROFILE HERE]"/> Really, I just want to know for now: is this the place to ask SEO questions?

  • Connect divs with (non-straight) lines [migrated]

    - by Snailer
    I'd like to develop my site with a layout that looks somewhat like houses with connected plumbing, or multiple computers connected to a network. Basically, there will be boxes floating in space, with lines connecting some of the boxes. I'd like these lines to have some turns in them as well (just simple 90-degree corners) rather than being straight lines. My question is what the best way to achieve this is, and perhaps a small example. My thoughts were to use:
    - PHP and CSS: I could create a background grid and then, with some complicated algorithms, draw paths using the grid's borders. This would be more dynamic, but I'm not sure I can plot the math all by myself.
    - Just CSS: Perhaps this is as simple as making some pre-drawn lines like L-shapes and T-junctions, then just placing and scaling them. But I don't believe there's a way to scale an image by slicing it, so the line width would be scaled too, and each image would look different.
    Any thoughts?
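
    For the CSS-only route, a minimal sketch of a 90-degree elbow drawn with plain borders, which scales cleanly without images (all dimensions and the class name are arbitrary):

        .connector {
            position: absolute;              /* place it between the two boxes */
            width: 120px;
            height: 80px;
            border-left: 2px solid #333;     /* vertical leg */
            border-bottom: 2px solid #333;   /* horizontal leg; together they form an L */
        }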

  • Ok to use table for calculator? [closed]

    - by max
    I'm a PHP/MySQL guy, and have been trying to brush up on my frontend skills. So this weekend I made a four-function calculator in JavaScript. But when I started to work on the presentation, I found myself adding extraneous markup just to achieve what a table tag does naturally. Just so we're on the same page, this is the intended layout:

        7 8 9 +
        4 5 6 -
        1 2 3 x
        c 0 = /

    Is it possible to generate this grid using neither a table nor extraneous markup? Thank you.
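
    One table-free sketch: float fixed-width buttons inside a container whose width forces a wrap after every fourth key (the #keys selector is illustrative):

        #keys { width: 160px; }
        #keys button {
            float: left;     /* four 40px keys fill the 160px row, then wrap */
            width: 40px;
            height: 40px;
        }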

  • curl to itself behind firewall

    - by xtreaming
    I have a server A which is configured behind a firewall and has a 30.x.x.x public address and a 172.x.x.x internal address. I'm trying to make a PHP cURL call from a script located on that server to the server's own 30.x.x.x external IP, but the call cannot be resolved. It seems that server A does not have a route to that IP. Has anyone encountered a similar situation? Any chance of solving it through static routes?

  • How can I prevent my site from being branded a "content farm?"

    - by Fredashay
    I'm building a small social Q&A site. Another Q&A site that I use was recently branded by Google as a content farm and removed from Google's results. I know what Wikipedia gives as the definition of a content farm (low-quality paid articles and spammy text across the page to catch search engines). That other site doesn't do those things, so there must be more to it than that. I want to make sure I don't do anything that causes Google to think my Q&A site is a content farm. What should I do, or avoid doing, in designing my site layout?

  • How does one redirect from one WordPress page to another via .htaccess?

    - by jchwebdev
    I tried making the following change to my WordPress site to permanently redirect a common link to a new page. I could've sworn that this used to work, but it simply does not (at least in WP 3.9), and I have had to resort to using a redirect plugin. I'm wondering why it doesn't work and whether there is a technique that will work; I'd prefer to continue using .htaccess for simplicity. Below is the .htaccess file:

        # MY CHANGES
        Redirect 301 http://mysite.com/gigs http://mysite.com/booking/

        # BEGIN WordPress
        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteBase /
        RewriteRule ^index\.php$ - [L]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule . /index.php [L]
        </IfModule>

    Again, it works by using a redirect plugin inside WP, but there must be a way to force the redirect to occur before the URL is passed to the WP engine, right? How is that done?
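
    A hedged note on the likely cause: mod_alias's Redirect directive expects a URL-path beginning with a slash as its first argument, not an absolute URL, so the line above is never matched against incoming requests. A sketch of the corrected rule, kept above the WordPress block:

        Redirect 301 /gigs http://mysite.com/booking/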

  • Disallowed images in the robots.txt of my Joomla site can't be displayed when shared in Facebook

    - by opk
    I have noticed that since I disallowed images using the robots.txt of my Joomla site, the image is not displayed when an article is shared on Facebook. Why is that? Are the two indeed related? My robots.txt file:

        User-agent: *
        Disallow: /administrator/
        Disallow: /cache/
        Disallow: /cli/
        Disallow: /components/
        Disallow: /images/
        Disallow: /includes/
        Disallow: /installation/
        Disallow: /language/
        Disallow: /libraries/
        Disallow: /logs/
        Disallow: /media/
        Disallow: /modules/
        Disallow: /plugins/
        Disallow: /templates/
        Disallow: /tmp/
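
    If the Disallow: /images/ line is indeed the cause, one hedged option is a rule that lets Facebook's crawler (its documented user-agent token is facebookexternalhit) fetch the images while the other rules stay as they are:

        User-agent: facebookexternalhit
        Allow: /images/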

  • Redirect subdomain to local pc

    - by user1188570
    I have a home web server which is constantly running. Is it possible to create a subdomain that sends traffic to another local PC? For example, I have one server and one notebook (with a web server installed, for development). At the moment I can access the notebook only from the local network, by IP. The server also hosts the domain example.com. I would like to visit laptop.example.com and reach my laptop.
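
    Assuming the server runs Apache with mod_proxy enabled, a sketch of a reverse-proxy virtual host (the notebook's LAN address 192.168.1.50 is hypothetical, and laptop.example.com needs a DNS record pointing at the server's public address):

        <VirtualHost *:80>
            ServerName laptop.example.com
            ProxyRequests Off
            # forward every request for this subdomain to the notebook on the LAN
            ProxyPass / http://192.168.1.50/
            ProxyPassReverse / http://192.168.1.50/
        </VirtualHost>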

  • How should I design my website to allow posterity to edit?

    - by SSumner
    I'm building a website for a student organization I am involved in at my college. Most of the site will be static, i.e. it won't change from year to year, but certain pieces will. I am high-tech, but most of the others aren't, and I am graduating in the spring. So how should I go about building the website so as to allow those who take over in subsequent years to edit information? Examples:
    - Events: I already plan on using a Google calendar for this.
    - Officers: There will be profiles/pictures for all the officers on the web page.
    - Connections: Partnerships with other organizations that we have currently, but may not have in future, or may add to in future.
    Should I use some form of CMS (Content Management System)? If so, how restrictive are they (e.g. Drupal) as to what you can build, and how easy is the result to edit? What other ways could I make a very nice-looking website but allow certain pieces to be edited later?

  • How long before Google will update search terms matching my website?

    - by Camran
    I have a website whose title I changed about a month ago. The website is a classifieds site which is dynamic, using PHP. The title changed from "Free classifieds" to "Buy and sell free classifieds". The strange part is that after about two weeks, the title shown in Google's search results changed to the new title, BUT when I searched for "buy and sell free classifieds" my website didn't show up at all; I have gone through over 30 pages of search results and my site isn't listed. However, searching for "free classifieds" still displays my website at the same position it was in before the title change. Any reason for this? How patient should I be? FYI, the website has a sitemap submitted and updated, good meta tags, and is W3C-valid, etc., so that is not the problem here. Thanks

  • What is wrong with this HTML5 <address> element? [closed]

    - by binaryorganic
        <div id="header-container">
          <address>
            <ul>
              <li>lorem ipsum</li>
              <li>(xxx) xxx-xxxx</li>
            </ul>
          </address>
        </div>

    And the CSS looks like this:

        #header-container address { float: right; margin-top: 25px; }

    When I load the page, it looks fine in Chrome & IE, but in Firefox it's ignoring the styling completely. When I view source in Firefox it looks like the markup above, but in Firebug it looks like this:

        <div id="header-container">
          <address>
          </address>
          <ul>
            <li>lorem ipsum</li>
            <li>(xxx) xxx-xxxx</li>
          </ul>
        </div>

  • Remove Border From Smilies in Post [migrated]

    - by komp smith
    Hello, I am finally getting to grips with CSS after about four years of picking it up as I go. This problem, though, has had me stumped for a few hours now, so I've given up and decided to ask for help and learn from it that way. All the smilies on my site have the image border that is meant for comment images. Example here: http://onlinebanter.com/node/5334. I've already removed the border with border:none in other places on my website, but I can't seem to change this one. Could anyone suggest something for me? Thanks
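
    A hedged sketch of a targeted override; the .comment img.smiley selector is hypothetical, so inspect the markup the site actually emits and make the rule at least as specific as the one that applies the border:

        /* drop the generic comment-image border for smilies only */
        .comment img.smiley {
            border: none;
        }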

  • Why would URLs submitted in Google Webmaster Tools drop to 0?

    - by ambient
    Why would URLs submitted in Google Webmaster Tools drop to 0? It's a small site, only about 20 pages. I submitted the XML sitemap, and for about a week it said 20 URLs submitted. A day or so ago it had indexed about 17 of the pages, but today it not only says that 0 are indexed, but also that 0 have been submitted. I did a site: search on Google and found that the pages are clearly indexed. Is this just an error in Google Webmaster Tools? Any help or thoughts would be appreciated. Thanks!

  • Unable to debug JavaScript?

    - by linkme69
    I'm having some problems debugging an encoded JavaScript. The script I'm referring to is given in the link over here. The encoding is simple: it works by shifting the Unicode values by whatever CodeKey was used during encoding. The code that does the decoding is given below:

        <script language="javascript">
        function dF(s) {
            var s1 = unescape(s.substr(0, s.length - 1));
            var t = '';
            for (i = 0; i < s1.length; i++)
                t += String.fromCharCode(s1.charCodeAt(i) - s.substr(s.length - 1, 1));
            document.write(unescape(t));
        }
        </script>

    I'm interested in understanding the values involved (e.g. s1 and t). For example, when i=0, what values would s1.charCodeAt(i) and s.substr(s.length-1,1) hold? The reason I'm doing this is to understand how the CodeKey really works. I don't see anything in the code above which tells it to decode on the basis of the CodeKey value. The only thing I can point to in the encoded text is the last character, which is set to 1, 2, 3 or 4 depending on the CodeKey selected during the encoding process. One can verify this using the link I have given above. To debug, I'm using the Firebug add-on with the script running as localhost on my WAMP server. I'm able to put a breakpoint on the JS using Firebug, but I'm unable to retrieve any of the user-defined parameters or functions I mentioned above. Given this, what would be the best way to debug this encoded JS?
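
    For what it's worth, a hedged sketch of how to surface those values: the last character of s is the CodeKey itself, and s.substr(s.length-1,1) is what gets subtracted from each character code. Logging instead of document.write makes the intermediates visible in the console (dFDebug is an illustrative name):

        function dFDebug(s) {
            var key = parseInt(s.substr(s.length - 1, 1), 10); // CodeKey: the last character of s
            var s1 = unescape(s.substr(0, s.length - 1));      // payload without the key
            console.log('key =', key, ' s1 =', s1);
            var t = '';
            for (var i = 0; i < s1.length; i++) {
                t += String.fromCharCode(s1.charCodeAt(i) - key); // shift each code back down
            }
            console.log('t =', t);
            console.log('decoded =', unescape(t));
        }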

  • What are the minimum steps that I should follow to ensure that my web site is accessible to the disabled?

    - by Tim Post
    I am trying to follow a very important standard that I must admit I have ignored up until recently. I want to make sure that my pages are accessible to the large portion of people that have disabilities. I focus mainly on tutorials that are text- and image-intensive, with no video, Flash, or animations of any kind. What is a checklist that I can follow to ensure that many people with disabilities can have a good experience when using my web site, and which disabilities should I be most conscious of? I know that I can't possibly please everyone. I have gone through the W3C guidelines; however, I'm not entirely sure which standards apply to me. I'm not building web applications; I'm building mostly wiki-like information exchanges, blogs, and the occasional forum.

  • How to Automate Checking for Stolen Content?

    - by Hisoka
    So I know about tools like Copyscape and Google Alerts. Great tools, but it's quite tedious for me to copy and paste a URL or phrase for every one of the pages in my sites. Is there any tool out there that monitors your website and emails or alerts you whenever someone has stolen content from your site? The only service I know of is CopySentry, and honestly, it's too expensive for me, since I have thousands of pages I want to monitor. Does anyone else have this problem, or is it just me? Thanks for any help.
