Search Results

Search found 13195 results on 528 pages for 'technical trainer pro'.


  • Restricting A Directory Through .htaccess

    - by Whitechapel
    I'm trying to put all of my FTP accounts into a folder at /public_html/ftp and password-protect it so search bots can't crawl their private files. I'm also trying to redirect all site traffic from the non-www domain to www. I keep getting 500 errors when accessing the site, and I need requests for www.vivalanation.com/ftp to go to www.vivalanation.com/ftp/, because without the trailing slash /ftp just errors out. Here is my .htaccess in the /public_html/ftp folder:

        RewriteEngine on
        RewriteBase /
        RewriteCond %{HTTP_HOST} !^www\. [NC]
        RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]
        AuthName "FTP Access"
        AuthType Basic
        AuthUserFile /home1/vivalst/.htpasswds/public_html/ftp/passwd
        Require valid-user

    I created a passwd file in /.htpasswds/public_html/ftp. And here is my basic .htaccess in the root of /public_html/:

        RewriteEngine on
        RewriteBase /
        RewriteCond %{HTTP_HOST} !^www\. [NC]
        RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]

    Read the article

  • Paypal "Subscribe" button: Is it possible to let the subscriber set the amount?

    - by Šime Vidas
    I'm setting up a recurring payment option on my website, and I'd like to offer two options. Option 1 (for individuals): a fixed $6/mo subscription. Option 2 (for organizations): a subscription where the amount is set by the subscriber. PayPal's "Subscribe" button does not seem to allow that: when I leave the "Amount" field of the second option empty, I get an error. So, is this not possible? Do all options require fixed amounts?

    Read the article

  • Wordpress login area for downloads

    - by user2248809
    I need to create a page that requires users to log in with a username/email and password to access it, and then, depending on who the user is, they get links to one or more files they can download. No need for a 'register' page - users will be added on the back-end. Can anyone recommend the best approach for this? Are there good plugins to handle this kind of thing? Thanks in advance for any guidance. It's WordPress 3.8.1, by the way.
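
    A minimal sketch of one approach, assuming the downloadable files live outside the web root and each user's allowed file names are stored in a hypothetical 'allowed_downloads' user meta key filled in by an admin on the back-end. The WordPress functions used are standard, but the meta key, paths and download handler are placeholders, not a finished plugin.

        <?php
        // Page template snippet for a members-only downloads page (sketch).
        if ( ! is_user_logged_in() ) {
            // Send visitors to the standard WordPress login form, then back here.
            wp_safe_redirect( wp_login_url( get_permalink() ) );
            exit;
        }

        $user  = wp_get_current_user();
        $files = (array) get_user_meta( $user->ID, 'allowed_downloads', true );

        if ( empty( $files ) ) {
            echo '<p>No downloads have been assigned to your account yet.</p>';
        } else {
            echo '<ul>';
            foreach ( $files as $file ) {
                // The handler behind /download/ should re-check the same user meta
                // before streaming the file, so links cannot simply be shared.
                $url = add_query_arg( 'file', rawurlencode( $file ), home_url( '/download/' ) );
                printf( '<li><a href="%s">%s</a></li>', esc_url( $url ), esc_html( $file ) );
            }
            echo '</ul>';
        }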

    Read the article

  • Form development optimization

    - by Juan
    Like many web developers, I do forms all the time, and I found myself doing the same thing every time: placing input fields, assigning a name to each, AJAXing the form, then creating the PHP, which involves assigning a PHP variable to each $_REQUEST['var'], escaping and validating the data, building the HTML and emailing the results. So I found that 70% of the work is duplicated, but I just can't duplicate a page and change the fields; I end up wasting more time reformatting, deleting and adding different fields than creating from scratch. I started planning to program a "list of IDs to HTML+PHP" converter in which I'd input all the IDs and it would output the basic HTML and PHP. Then I thought: there must be thousands of developers who go through this, and I'd be reinventing the wheel. So this is my question: I'm trying to find that wheel that somebody must have invented already. I found this: http://www.trirand.com/blog/jqform/ which does more or less what I'm looking for, but it's an expensive solution and it has too much functionality for what I'd use it for. Which tools do you use to optimize repetitive tasks like this in HTML and PHP?
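
    A minimal sketch of the "field list drives everything" idea described above, in plain PHP with no framework; the field names, validation rules and recipient address are illustrative only.

        <?php
        // Sketch: one field definition drives rendering, validation and the email body.
        $fields = array(
            'name'  => array( 'label' => 'Name',  'type' => 'text',  'required' => true ),
            'email' => array( 'label' => 'Email', 'type' => 'email', 'required' => true ),
            'notes' => array( 'label' => 'Notes', 'type' => 'text',  'required' => false ),
        );

        if ( $_SERVER['REQUEST_METHOD'] === 'POST' ) {
            $clean  = array();
            $errors = array();
            foreach ( $fields as $id => $f ) {
                $value = isset( $_POST[ $id ] ) ? trim( $_POST[ $id ] ) : '';
                if ( $f['required'] && $value === '' ) {
                    $errors[] = $f['label'] . ' is required.';
                }
                if ( $f['type'] === 'email' && $value !== '' && ! filter_var( $value, FILTER_VALIDATE_EMAIL ) ) {
                    $errors[] = $f['label'] . ' is not a valid address.';
                }
                $clean[ $id ] = $value;
            }
            if ( ! $errors ) {
                // Build the email body from the same definition.
                $body = '';
                foreach ( $fields as $id => $f ) {
                    $body .= $f['label'] . ': ' . $clean[ $id ] . "\n";
                }
                mail( 'you@example.com', 'Form submission', $body );
            }
        }

        // Render the form from the same definition, re-filling submitted values.
        foreach ( $fields as $id => $f ) {
            $old = isset( $_POST[ $id ] ) ? $_POST[ $id ] : '';
            printf(
                '<label>%s <input type="%s" name="%s" value="%s"></label><br>',
                htmlspecialchars( $f['label'] ),
                htmlspecialchars( $f['type'] ),
                htmlspecialchars( $id ),
                htmlspecialchars( $old )
            );
        }

    Adding a new field then means adding one line to the definition rather than touching the markup, the validation and the email code separately.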

    Read the article

  • Bayesian content filter for vbulletin [on hold]

    - by mc0e
    I've been tasked with coming up with a tool to automatically flag some posts for moderator attention on a large vbulletin forum. It's not spam per se, but the task has a lot in common with the sort of handling that might be done by a spam protection plugin (a "mod" in vbulletin speak). There's only so much I can say, but the task does not involve bad users so much as particular kinds of posts which the moderators need to be aware of. Filtering out user registrations and links is therefore not useful, and we are talking about posts by real human users. What I'm looking for is an existing Bayesian classification plugin, or something that I can study to get an understanding of how to do the vbulletin side of the interface in order to build such a thing. I.e., I'd need ways for moderators to list flagged posts and to correct the classification of posts which have been mis-classified. Ideally I want a three-way split with an "unsure" category in order to reduce what has to be reviewed to find any mis-classifications. Any pointers? I've searched around a bit, and so far what I've found has been more or less entirely targeted at intervening in sign-ups (mostly using stopforumspam), captchas, and use of external services like Akismet, which are spam-specific. I'm also considering an external solution which might be able to be interfaced with the forum.
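
    A minimal sketch of the classifier core, independent of vbulletin, assuming token counts are built from posts the moderators have already labelled; the thresholds for the "unsure" band are illustrative and would need tuning.

        <?php
        // Sketch of a naive Bayes scorer with a three-way outcome (flag / ok / unsure).
        // $flagCounts and $okCounts map token => number of labelled posts containing it;
        // $flagTotal and $okTotal are the number of posts in each labelled class.
        function classify_post( $text, array $flagCounts, array $okCounts, $flagTotal, $okTotal ) {
            $tokens = array_unique( str_word_count( strtolower( $text ), 1 ) );

            // Work in log space to avoid underflow; start from the class priors.
            $logFlag = log( $flagTotal / ( $flagTotal + $okTotal ) );
            $logOk   = log( $okTotal   / ( $flagTotal + $okTotal ) );

            foreach ( $tokens as $t ) {
                $inFlag = isset( $flagCounts[ $t ] ) ? $flagCounts[ $t ] : 0;
                $inOk   = isset( $okCounts[ $t ] )   ? $okCounts[ $t ]   : 0;
                // Laplace smoothing so unseen tokens don't zero out a class.
                $logFlag += log( ( $inFlag + 1 ) / ( $flagTotal + 2 ) );
                $logOk   += log( ( $inOk   + 1 ) / ( $okTotal   + 2 ) );
            }

            // Probability that the post needs moderator attention.
            $pFlag = 1 / ( 1 + exp( $logOk - $logFlag ) );

            if ( $pFlag > 0.9 ) return 'flag';   // confident: queue for moderators
            if ( $pFlag < 0.1 ) return 'ok';     // confident: leave alone
            return 'unsure';                     // middle band: review, then feed corrections back in
        }

    Corrections made by moderators would update the token counts, which is the feedback loop that makes this kind of filter improve over time.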

    Read the article

  • GZipped Images - Is It Worth It?

    - by charlie
    Most image formats are already compressed. But if I take an image, gzip it, and compare the compressed file to the uncompressed one, there is a difference in size, even if not a dramatic one. The question is: is it worth gzipping images? The content size flushed down to the client's browser will be smaller, but there will be some client-side overhead in decompressing it. Please advise.
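
    A quick way to check the actual savings on your own files is to gzip one and compare sizes; a minimal PHP sketch, where photo.jpg is a placeholder path:

        <?php
        // Sketch: measure how much gzip saves on an already-compressed image.
        $path    = 'photo.jpg';              // placeholder: any image you actually serve
        $raw     = file_get_contents( $path );
        $gzipped = gzencode( $raw, 6 );      // level 6 is a common server default

        printf(
            "original: %d bytes, gzipped: %d bytes, saving: %.1f%%\n",
            strlen( $raw ),
            strlen( $gzipped ),
            100 * ( 1 - strlen( $gzipped ) / strlen( $raw ) )
        );

    For JPEG, PNG and GIF the saving is typically only a few percent, which is why most servers are configured to gzip text assets (HTML, CSS, JavaScript) but leave images alone.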

    Read the article

  • How to redirect an international domain to a subfolder on the English site without hurting Google rankings?

    - by ernest1a
    I have two sites: www.example.de and www.main.com. www.main.com is the English version of www.example.de, which is in German. I want to keep only www.main.com. For the English version I will keep www.main.com, but for German I want to move it to www.main.com/de. I am wondering what the best solution for the old www.example.de would be: Redirect everything from www.example.de to www.main.com/de using a 301 redirect? Or redirect each page from www.example.de to www.main.com/de/page-url-of-old-site.html, so each link actually gets its own address? Is that necessary, or will Google realize where each page belongs on the new site even if I redirect everything to the home page? Any other solution, maybe just setting the new domain in Google Webmaster Tools, or anything like that?

    Read the article

  • How to incorporate Taiwan into ISO 3166-1?

    - by Misha
    I received a rather heartfelt e-mail from a user on our site explaining that Taiwan isn't a province of China and that they wouldn't register until this was changed. Our site is built by a dozen or so valued contributors, so we are going to change it. See the issue here: http://www.iso.org/iso/country_codes/iso_3166-faqs/iso_3166_faqs_specific.htm. I found the lines to edit in my CMS, but we have valued users from both China and Taiwan. I was wondering what the most politically neutral way to amend my country codes would be. What should the entries for China and Taiwan say?

    Read the article

  • Which tags to use for good SEO on the page

    - by Aaditi Sharma
    I have an event page with the following items:

        - Event name
        - Venue name(s) (in some cases up to 5 or more venues)
        - Event info (genre(s), language, type(s))
        - Date(s) on which the event is held
        - Event description

    Since the event name is unique and present in the title, I am assigning <H1> to it. However, there are multiple venue names, and the same venue may be repeated across the page, along with the dates. Each piece of event info is used a single time on the page. The dates are presented in a styled manner using multiple spans; however, I am going to add a title attribute to them. The event description is in a <p> tag. So my question is: which heading tags should I use for a good semantic description and SEO? Also, for the title on the dates, which format should I keep the date in (dd/mm/yyyy)?

    Read the article

  • SEO problem for site with 2 domains [closed]

    - by Harry
    Possible Duplicate: What is duplicate content and how can I avoid being penalized for it on my site? I have two domains pointing to the same site. I want both domains to co-exist; they share most of the same content, but they differ in design and they are aimed at different markets / rivaling communities. Is there a way to let Google know that these two domains are the same site, so I don't get hit with a duplicate content penalty? Any other general SEO tips for this situation would also be welcome. Thanks. Come on man, why was this closed? The linked page is completely irrelevant to me.

    Read the article

  • MediaWiki plugin for dynamic content via forms

    - by Geek42
    Are there any plugins for MediaWiki that would allow me to create a page with a form at the top that, when filled in, populates tags further down in the document? Say someone put a form with "Source Server:" and "Destination Server:" fields at the top. Once those were typed in, the page would automatically substitute those names into the content lower down, so that when following the instructions you could just read the docs and not have to mentally replace things (and possibly mess it all up). I'm not looking to make pages that are permanent, just ones that can have values entered before they are followed. Any suggestions?

    Read the article

  • Will we be penalized for having multiple external links to the same site?

    - by merk
    There seem to be conflicting answers to this question. The most relevant ones seem to be at least a year or two old, so I thought it would be worth re-asking. My gut says it's OK, because there are plenty of sites out there that do this already. Every major retailer site usually has links to the manufacturer of whatever item they are selling; go to www.newegg.com and they have hundreds of links to the same site, since they sell multiple items from the same brand. Our site allows people to list a specific genre of items for sale (not porn - I'm just keeping it generic since I'm not trying to advertise), and on each item listing page we have a link back to their website if they want. Our SEO guy is saying this is really bad and Google is going to treat us as a link farm. My gut says that when we have to start limiting useful features for users in order to boost our ranking, or start jumping through hoops like trying to hide text using JavaScript, something is wrong. Some clients are only selling one to a handful of items, while a couple of our bigger clients have hundreds of items listed, so they will have hundreds of pages that link back to their site. I should also mention that there will be a handful of pages with the bigger clients where it may appear they have duplicate pages, because they will be selling 2 or 3 of the same item, and the only difference in the content of the page might just be a stock #. The majority of the pages, though, will have unique content. So - will we be penalized in some way for having anywhere from a handful to a few hundred pages that all point to the same link? If we are penalized, what's the suggested way to handle this? We still want to give users the option to go to the client's site, and we would still like to give a link back to the client's site to help their own search engine rankings.

    Read the article

  • SEO/Google: How should I handle multiple countries and domains?

    - by Valorized
    Hello. I'm the webmaster of an online shop based in Austria (Europe), so we registered "example.at". We also own other domain names like "example-shop.com" and "example.info". Currently all those domains are redirected (301) to the .at one. Still available are "example.net" and "example.org" (and .ws/.cc); unfortunately .de/.eu are not available. The .com is currently owned by one of our partners; the contract ends in 2012, but until then we have no chance of getting it. Recently I read more about geo-targeting and I noticed ONE big deal: the TLD ".at" is hardly recognised in Germany (google.de), whereas it is excellently listed in Austria (google.at). Because of the .at, I cannot set the target location manually (or to unlisted). More info: https://www.google.com/support/webmasters/bin/answer.py?answer=62399&hl=en This is a big problem. I looked at Google Analytics and - although Germany is 10x as big as Austria - there are more visits from Austria. So, how should I configure the domains in order to get the best results in both Germany and Austria? I thought of some solutions:

    1. Stop redirecting the .info. Then there would be a duplicate of the .at one. Moreover, in Webmaster Tools, I could set the target location of the .info to Germany. As the .at still targets Austria, both would be targeted; however, I don't know if Google punishes one of them because of the duplicate content.
    2. Same as 1., but with .net or .org (I think .info is not a "nice" domain, and moreover I think search engines prefer .com, .net or .org to .info).
    3. Same as 1. (or 2.), but with a rel="canonical" on the new one pointing to the .at. Con: I don't think this will improve the situation, because it still tells Google that the .at one is more important, as in: "if .info points to .at, the target may still be Austria".
    4. rel="canonical" on the .at pointing to the new one (.info, .net or .org). However, I fear that this will have a negative impact on the listing on google.at, along the lines of: "Hey, the well-known .at is not important anymore, so let's focus on the .info, which is not well-known." Therefore: a bad position in search results.
    5. Redirect .at to the new one (.info, .net or .org) with a 301 redirect. Con: might be worse than 4.; we might lose PageRank (or "the value of the page", since Google says PageRank is not that important anymore). Moreover, this might be even more confusing for the customers. In 3. or 4. customers don't get redirected and never see the canonical meta tag.

    So, dear experts, please tell me what the best option would be! Thank you very much in advance for your advice, and please excuse the long question. I really appreciate this network! Please note: it's exactly the same content AND language. In Austria we speak German.

    Read the article

  • Why did I lose my page rank after a 301 redirect?

    - by rajesh.magar
    As we all know, Google treats sub-domains as completely separate domains, so we have to fight for both to get ranked in search results. One of my clients' websites had example.com and blog.example.com. So, with the aim of keeping everything in one place, we redirected blog.example.com to example.com/blog/. But in doing so we lost our PageRank, and we are still wondering where we went wrong, or whether it just takes some more time to show up. So what is the reason behind this?

    Read the article

  • Adding tags for SEO in a clothing website [duplicate]

    - by samyb8
    This question already has an answer here: What are the best ways to increase a site's position in Google? 18 answers I am building a site for a women's accessories brand. The site has a homepage, a store page (where all accessories are displayed), a page with each accessory's description, an about page and a contact page. There is also a whole setup for the shopping cart and checkout (irrelevant to this question). My issue is the SEO. Where can I put the keywords? The home page has only the menu and some photos. The store page displays the items and their titles. Then the specific item's page has a description of the item (pulled from the database), its category and its price. However, I feel like this is not enough SEO for Google ranking. Where could I add tags in this type of site?

    Read the article

  • How to keep Google rank and index for a page that changed its URL? [closed]

    - by ProSoft
    Possible Duplicate: How to tell Google that I have changed my website URLs? Recently, I changed the URL of one of my web pages. Of course, I did it with URL rewriting, and now I want to keep the rank of this page in Google and Bing. For example: the main address of the page is http://mywebsite.com/page1.php, the virtual address by URL rewriting was http://mywebsite.com/page, and the new address is http://mywebsite.com/newTitlePage. Now, when I open this page from a search in Google, I get a 401 error (not found). How should I handle this?
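
    A minimal sketch of the usual fix: keep the old URL reachable just long enough to issue a permanent redirect to the new one, so the engines transfer the page's ranking instead of hitting an error. The target path below is simply the poster's example address.

        <?php
        // page1.php - sketch: permanently redirect the old URL to the new one.
        header( 'HTTP/1.1 301 Moved Permanently' );
        header( 'Location: http://mywebsite.com/newTitlePage' );
        exit;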

    Read the article

  • Where to get ads for my website?

    - by Divyanshu Negi
    I am the developer of the website named viewloud. Now that my website is getting around 100 visitors per day, I was thinking that I should put some ads on it, but it is really very hard to find the advertising plan that can benefit me the most. Is Google AdSense a good choice? Will Google AdSense allow me to open an account with such little traffic on my website? Please help; over the last two months I have worked very hard to bring in this traffic :p I know it is very little, but I am still working on it, so please help me out, guys. Thank you Divyanshu

    Read the article

  • How to push through a domain transfer in spite of the 60 day rule

    - by corsiKa
    I recently purchased a domain through a registrar which I won't name here. Within the first five minutes of logging in, I found a severe vulnerability that allows me access to all registration details of all users. Simply put, I do not trust this registrar with any kind of business. But I'm unable to transfer the domain because, for some reason, it has to exist in its current state for 60 days. We're planning to launch the site this weekend - we can't wait 60 days. But I can not trust this registrar: if I found such a severe vulnerability in the first few minutes, how many more similar un-trustables will I find in those 60 days? Is there a higher authority to whom I can submit a case to get my domain transferred to a different registrar?

    Read the article

  • How to Keep SEO Score from Dropping with Duplicate Content

    - by joeh0717
    I'm hoping that someone has a solution for what I'm trying to accomplish. I'm working on a travel agency web site, and there's an "Overview" section for each cruise line. These overviews are located on the index page for each cruise line. Here's my issue: the company is creating a search engine that includes details on each cruise line. Their write-ups on each cruise line are great, so I'd like to include the overview they created for each cruise line rather than having to create all new ones. However, I don't want duplicating their content to negatively affect the SEO scores of the pages they originally put this content on. It's going to duplicate, since each page that's dynamically generated by their search engine is going to include a section about the cruise line (where I'd want to place the overview). Question: is there any way that I can include these overviews (ideally, copying the exact HTML that they've already implemented) without the search engines indexing those particular code sections? I'd want the rest of the search result pages to be indexed... just not the section of each page that contains this duplicate code. I saw something about using a span class named robots-nocontent in Yahoo (not sure if this also applies to Bing) and googleon / googleoff tags in Google. Is this the best solution? I'm open to any suggestions, thanks!

    Read the article

  • How should I study a competitor's off page SEO?

    - by Chris Adragna
    What do I need to do, and with what tools, to learn what a competitor has working for them off-page (free and paid tools; please suggest both)? First of all, I'm supposing I want to see all of the sites linking in and see what anchor text is used. Is there something that would report on the anchor text linking in, such as counting the keyword phrases used as anchor text? Also, it would be helpful to see where the PageRank is coming from, such as listing inbound links by the PR of the page linking in. Lastly, if I'm missing something here in the way of off-page attributes, please say so.

    Read the article

  • How to enable the user to add background images to anchor links through the WordPress admin panel? [closed]

    - by janoChen
    I have CSS selectors like this in my style.css:

        .jimgMenu ul li.landscapes a {
            background: url(../images/landscapes.jpg) repeat scroll 0%;
        }

    What's the easiest way to enable the user to add background images to anchor links like the ones below, in front-page.php?

        <div class="jimgMenu">
            <ul>
                <li class="landscapes"><a href="#nogo">Landscapes</a></li>
                <li class="people"><a href="#nogo">People</a></li>
                <li class="nature"><a href="#nogo">Nature</a></li>
                <li class="abstract"><a href="#nogo">Abstract</a></li>
                <li class="urban"><a href="#nogo">Urban</a></li>
                <li class="people2"><a href="#nogo">People</a></li>
            </ul>
        </div>

    To illustrate:

        .jimgMenu ul li.landscapes a {
            background: url(<add background image>) repeat scroll 0%;
        }

    What would that code look like?
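
    A minimal sketch of one way to do it with the WordPress Customizer, assuming the theme registers one image setting per menu item and prints the resulting rules from wp_head; the setting names and section title are illustrative.

        <?php
        // functions.php - sketch: let the user pick a background image per menu item
        // under Appearance > Customize, then output the CSS rules that use them.
        function jimg_customize_register( $wp_customize ) {
            $wp_customize->add_section( 'jimg_menu_images', array( 'title' => 'Image Menu Backgrounds' ) );
            foreach ( array( 'landscapes', 'people', 'nature', 'abstract', 'urban', 'people2' ) as $item ) {
                $wp_customize->add_setting( "jimg_bg_{$item}" );
                $wp_customize->add_control( new WP_Customize_Image_Control(
                    $wp_customize,
                    "jimg_bg_{$item}",
                    array( 'label' => ucfirst( $item ), 'section' => 'jimg_menu_images' )
                ) );
            }
        }
        add_action( 'customize_register', 'jimg_customize_register' );

        function jimg_print_menu_css() {
            echo '<style>';
            foreach ( array( 'landscapes', 'people', 'nature', 'abstract', 'urban', 'people2' ) as $item ) {
                $url = get_theme_mod( "jimg_bg_{$item}" );
                if ( $url ) {
                    printf( '.jimgMenu ul li.%1$s a { background: url(%2$s) repeat scroll 0%%; }', $item, esc_url( $url ) );
                }
            }
            echo '</style>';
        }
        add_action( 'wp_head', 'jimg_print_menu_css' );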

    Read the article

  • redirect url ending with dot

    - by Michael
    I submitted my site's URL to my workplace's printed newsletter, and when I got the printed version they had added a dot to the end of it. Some people will realize that the period is not part of the URL, but others will not. Is there an easy way to redirect from http://example.com/home. to http://example.com/home? I have IIS 7.0 shared hosting with GoDaddy. This means I have access to the box only through their interface, so some options might be limited.

    Read the article

  • How to add SMS text messaging functionality to my website?

    - by jessegavin
    I want to add the ability to send reminders to people via email and SMS for specific events that they have signed up for on a web application that I am building. The email part is not difficult, but I am wondering where to find a good solution for sending SMS messages. It would also be a plus if this solution allowed two-way SMS communication with my web application, so that people would be able to reply with a CONFIRM or CANCEL type of message. Has anyone implemented something like this? Does anyone know of good tools out there? EDIT: I am realizing that this is more of a "lots of ways to skin this cat" type of question, so I changed it to community wiki.
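
    Most hosted SMS gateways expose a plain HTTP API, so the sending side is a single POST from PHP. A minimal sketch against Twilio's REST Messages endpoint follows; the account SID, auth token and phone numbers are placeholders, and the exact endpoint and fields should be checked against the provider's current documentation before relying on them.

        <?php
        // Sketch: send one SMS reminder through an HTTP SMS gateway (Twilio shown here).
        $sid   = 'ACXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX';   // placeholder account SID
        $token = 'your_auth_token';                      // placeholder auth token

        $ch = curl_init( "https://api.twilio.com/2010-04-01/Accounts/{$sid}/Messages.json" );
        curl_setopt_array( $ch, array(
            CURLOPT_POST           => true,
            CURLOPT_POSTFIELDS     => http_build_query( array(
                'From' => '+15550001111',   // your gateway-provided number
                'To'   => '+15552223333',   // the subscriber's number
                'Body' => 'Reminder: your event starts at 7pm. Reply CONFIRM or CANCEL.',
            ) ),
            CURLOPT_USERPWD        => "{$sid}:{$token}",
            CURLOPT_RETURNTRANSFER => true,
        ) );
        $response = curl_exec( $ch );
        curl_close( $ch );

    Two-way replies typically arrive as HTTP callbacks to a URL configured on the number, so a separate script on the site would parse the incoming message body for CONFIRM or CANCEL and update the event signup accordingly.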

    Read the article

  • What effect does using itemprop="significantLinks" on anchors have for SEO and what is the ideal use?

    - by hdavis84
    I'm practicing the application of microdata via http://schema.org. Anyone who's browsed the documentation there knows that the explanations of how each property should be used leave a lot to be desired. My question is specifically about the "significantLinks" property and how it affects SEO for on-page, in-content anchor text. Does anyone have any more information on whether it's good to use for link optimization? I understand that schema.org means it to be used on "non-navigational links", and that those links should be relevant to the current page's meaning. But will using this property hurt SEO or improve it for each page?

    Read the article

  • SEO-meta description crawling issue [duplicate]

    - by user3707382
    This question already has an answer here: Meta Descriptions not working for google search 3 answers I have the following code where I'm including my title and description for the page, but Google crawled only the title, not the meta description from the code, whereas the description it showed was taken from the keywords present in the HTML of the page. Please guide me, guys, as to where I'm going wrong:

        <!DOCTYPE html>
        <html>
        <head>
        <title>title inserted here</title>
        <meta http-equiv="Content-Type" content="text/html;charset=utf-8">
        <meta name="description" content="description here"/>

    Read the article
