Search Results

Search found 9717 results on 389 pages for 'pro metedor'.

Page 90/389 | < Previous Page | 86 87 88 89 90 91 92 93 94 95 96 97  | Next Page >

  • Hide email address with JavaScript

    - by Martin Aleksander
    I read somewhere that hiding an email address behind JavaScript code could reduce spam bots harvesting the address:

        <script language="javascript" type="text/javascript">
        var a = "Red";
        var t = "no";
        var doc = document;
        var b = "ITpro";
        var ad = a;
        ad += "@";
        ad += b;
        ad += ".";
        ad += t;
        var mt = "ma";
        mt += "il";
        mt += "to";
        var text = "";
        if (text == null || text.length == 0) text = ad;
        doc.write("<"+"a hr"+"ef=\""+mt+":"+ad+"\">"+text+"</"+"a>");
        </script>

    This will not display the actual email address in the source code of the page, but it will display and work like a normal link for human users. Is there any point in doing this? Will it reduce spam bots, or is it just nonsense that might slow down the performance of the page because of the JavaScript?

    Read the article

  • Multi-database link and mix-and-match email alert

    - by menardmam
    I have a site with a large database of people who have knowledge in different domains, such as teaching (maths, French, science, etc.). The site has a page where you can search for people based on different criteria, such as distance from home, grade, and sex. Now I would like to add a page where people looking for a mentor can file a request, and when a tutor matching that request enters the database, an email is sent to the requester. I know for sure that if you look for a maths teacher for your 10-year-old son in January and find none, you won't come back in February, March, and so on just to check; you want to be informed automatically when a matching tutor is added to the database (more or less like www.jobboom.com). So the question is: what CMS do I need to be able to do that? WordPress, Drupal, or something custom made?

    Read the article

  • Only 192.168.0.3 can request, but anyone can request /public/file.html

    - by mattalexx
    I have the following virtual host on my development server:

        <VirtualHost *:80>
            ServerName example.com
            DocumentRoot /srv/web/example.com/pub
            <Directory /srv/web/example.com/pub>
                Order Deny,Allow
                Deny from all
                Allow from 192.168.0.3
            </Directory>
        </VirtualHost>

    The Allow from 192.168.0.3 part is there to only allow requests from my workstation. I want to tweak this to allow anyone to request a certain URL: http://example.com/public/file.html How do I change this to let /public/file.html requests through from anyone? Note: /public/file.html doesn't actually exist as a file on the server. I redirect all incoming requests through a single index file using mod_rewrite.
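
    One possible approach (an untested sketch; it relies on <Location> sections being merged after <Directory> sections, so the more specific block wins) is to open up just that URL path inside the same virtual host:

        <Location /public/file.html>
            Order Deny,Allow
            Allow from all
        </Location>

    Because <Location> matches the request URL rather than a file on disk, it also covers paths that only exist via mod_rewrite.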

    Read the article

  • How to support email subscriptions to many RSS feeds

    - by peter
    I am interested in offering the option to subscribe to any of my RSS feeds by email, without having to manage the email lists myself. Are there any email delivery services that allow easy subscription to arbitrary feeds? MailChimp's API doesn't allow list creation. The closest I can come up with is linking people to a Google Alert: http://www.google.com/alerts?q=site:mysite.com/category/food http://www.google.com/alerts?q=site:mysite.com/category/drinks

    Read the article

  • Can search engine robots read a file with permission 640?

    - by dkjain
    I am on a shared Linux web hosting server. I want search engine robots/spiders to be able to read robots.txt, but not anyone typing www.mysite.com/robots.txt into a browser. According to the following Google Groups post, setting the file permission to 640 makes it possible to deny the world access to the robots.txt file while still letting search engine robots read it. Is that true? If not, how is it possible to deny the general public access to robots.txt but still allow search engine robots to read it?
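
    For what it's worth, file permissions cannot make that distinction: the web server reads robots.txt as the same system user no matter who requested it, so with 640 either everyone gets the file or nobody does. What is sometimes suggested instead is matching on the User-Agent header in .htaccess; a sketch (user agents are trivially spoofed, so this only deters casual visitors):

        SetEnvIfNoCase User-Agent "Googlebot|Bingbot|Slurp" allowed_bot
        <Files "robots.txt">
            Order Deny,Allow
            Deny from all
            Allow from env=allowed_bot
        </Files>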

    Read the article

  • Managing user privileges, best practices [on hold]

    - by Loïc N.
    I am new to web development. I'm creating a website where different users can have different privileges, such as creating/editing/deleting a news item, or adding/editing/deleting any other kind of content on the website. I started by creating a "user type" that would indicate the user's privileges (such as "user", "newser", "moderator", "admin", and so on), but I quickly noticed issues that made me think this might be a naive approach. What if I want to give a regular user the right to edit a news item (for whatever reason)? Then the user would be half "user", half "newser", but the system I use can only handle one user type. So what would be the best practice here? I was thinking of removing the concept of roles (or "user types" such as newser) and keeping only the concept of "privilege", where every user can have zero to many privileges. To reuse the example above, if I wanted a user to have the right to edit some news items, I would only have to give them an "edit news" privilege. Is this the way to go?
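
    The per-user privilege idea maps naturally onto a many-to-many join table. A minimal sketch in PHP/PDO (the table and column names are made up for illustration):

        <?php
        // Hypothetical schema:
        //   CREATE TABLE privileges (id INT PRIMARY KEY, name VARCHAR(50));
        //   CREATE TABLE user_privilege (user_id INT, privilege_id INT,
        //       PRIMARY KEY (user_id, privilege_id));

        function userCan(PDO $db, $userId, $privilegeName) {
            $sql = "SELECT COUNT(*) FROM user_privilege up
                    JOIN privileges p ON p.id = up.privilege_id
                    WHERE up.user_id = ? AND p.name = ?";
            $stmt = $db->prepare($sql);
            $stmt->execute(array($userId, $privilegeName));
            return $stmt->fetchColumn() > 0;
        }

        // Usage: if (userCan($db, $currentUserId, 'edit news')) { ... }

    Many systems keep roles as well and treat a role as a named bundle of privileges, which avoids assigning dozens of rows per user while still allowing one-off grants.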

    Read the article

  • Does a redirect pop-up window affect SEO?

    - by Joseph
    We have multiple websites, and each site serves a number of countries. We used to have a Geo-IP auto-redirect system (no one likes auto-redirects), so we implemented another Geo-IP-based system that instead shows a pop-up window (an HTML layer pop-up, so it can't be blocked). This window asks the visitor whether he would like to continue with the current page or go to the correct website for his country. We also added a test before showing the pop-up, so if the visitor is Googlebot, the pop-up will not show up :). I was wondering whether this affects our websites' SEO?
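
    The "test" described is typically a simple User-Agent check; a PHP sketch (variable names are made up, and note that deliberately showing bots different content than humans is what Google's guidelines call cloaking, which carries its own SEO risk):

        <?php
        $ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
        $isGooglebot = (stripos($ua, 'Googlebot') !== false);
        if (!$isGooglebot) {
            // emit the country-selection pop-up markup here
        }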

    Read the article

  • Are there sources of email marketing data available?

    - by Gortron
    Are sources of email marketing data available to the public? I would like to see what kind of content businesses send out, how often they send, how many people they email, and especially the resulting open rates and click-through rates. Are businesses willing to share data on their previous email marketing campaigns without divulging their contact lists? I would like to use this data in an application that helps businesses create better newsletters, using the data as a benchmark: basically sharing what works and what doesn't for each industry.

    Read the article

  • How do I make sure the web developer I hire will not steal my idea?

    - by Greg McNulty
    So I have a great idea for a new website, but not the time to develop it. I would like to hire a person or company to build it for me. What steps do I need to take to protect my idea? Where and how do people protect website ideas in general? Also, how easy is it for someone to tweak the idea and legally make it their own? Is a patent enough to protect such a thing? Are there different levels or types of protection? Thank you.

    Read the article

  • 410 Responses when your CMS host doesn't support them?

    - by leeand00
    Sending a 410 response for a page that no longer exists should make Google stop crawling that page. The site I am working on was recently migrated, and very little of the content was migrated with it. I've already set up 301 redirects for the content that exists on both the old and the new site, but now I would like to flush the old content from Google's memory by serving 410 responses when it returns to crawl those URLs and would otherwise find a 404. However, when I asked our CMS host about it, they said that our CMS does not support 410 responses. Is there some other way to serve a 410, such as making a dead link 301-redirect to a page that returns a 410, or something equivalent in a meta tag?
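
    If the host allows .htaccess overrides, a 410 can often be produced without any CMS support at all; a sketch (the URL paths are placeholders):

        # mod_alias: mark one retired URL as gone
        Redirect gone /old-page.html

        # mod_rewrite alternative: 410 for a whole retired section
        RewriteEngine On
        RewriteRule ^old-section/ - [G]

    A 301 that lands on a page returning 410 is less clean, since the crawler records the redirect first and the dead URL itself never directly answers 410.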

    Read the article

  • AdSense click bot is click-bombing my site

    - by Graham
    I have a site that gets roughly 7,000 - 10,000 page views per day right now. Starting around 1 AM on 7/1/12, I noticed the CTR rising dramatically. These clicks would be credited and then de-credited soon after, so they were obviously fraudulent. The next day I had about 200 clicks in my account, with about 100 of them fraudulent. It's about 3 - 8 per hour, evenly dispersed across each of the three ads, 24 hours a day, which leads me to believe it's some sort of AdSense click bot. Also, I removed the ads last evening, put them back up around 3 AM, and the invalid clicks started within 10 minutes. I signed up for statcounter.com to analyze the exit links on the AdSense units. Then I conditionally blocked ads for the IP address of the person / bot I suspected, but I think the bot has several proxies to choose from and can refresh its IP address. I've notified Google through the invalid-click form / email four times over the past two days to let them know I'm aware of the situation and am working on a solution. I've also temporarily removed all ads on that site. How can I block a bot like this? Thank you.

    Read the article

  • web hosting locally

    - by Pradyut Bhattacharya
    I have made a website and hosted it on my local computer using a static IP. Where can I buy a domain name such as www.something.com and point it at my static IP, so that a page like http://localhost/index.jsp can be accessed as http://www.something.com/index.jsp? Does it matter whether I run the server locally or buy managed web hosting from a big company if the traffic on my site is low? Thanks.
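
    Any registrar that lets you edit DNS records can do this: you buy the domain, then add an A record pointing the name at your static IP. A sketch of the zone entries (203.0.113.10 stands in for the real static IP):

        something.com.      IN  A  203.0.113.10
        www.something.com.  IN  A  203.0.113.10

    The router would also need to forward port 80 to the machine running the server, and the ISP must not block inbound connections on that port.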

    Read the article

  • Joomla and Google Analytics advanced options in tracking code

    - by miako
    I want to insert the Google Analytics tracking code into my Joomla site, so I registered on Google's official site and saw that there is an advanced tab with three more options than the standard one. Do I have to check "I want to track dynamic pages" and "I want to track PHP pages"? Do these options give me better results, and are they necessary for a dynamic PHP-based site like Joomla? Does anyone know the installation process? I didn't manage to make it work by following this. Also, where do I place the tracking code? Because of some bugs, some say it is better just after the <body> tag, whereas others say just before the </body> tag. Thank you.
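
    For reference, the asynchronous snippet Google was distributing around that time looked like the following (UA-XXXXX-X is a placeholder for your own property ID), and Google's instructions for the async version recommended placing it just before the closing </head> tag:

        <script type="text/javascript">
          var _gaq = _gaq || [];
          _gaq.push(['_setAccount', 'UA-XXXXX-X']);
          _gaq.push(['_trackPageview']);
          (function() {
            var ga = document.createElement('script');
            ga.type = 'text/javascript'; ga.async = true;
            ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
            var s = document.getElementsByTagName('script')[0];
            s.parentNode.insertBefore(ga, s);
          })();
        </script>

    In Joomla, the usual approach is to paste the snippet into the template's index.php, or use a plugin, so it appears on every page.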

    Read the article

  • php image upload library

    - by Tchalvak
    I'm looking to NOT reinvent the wheel with an image upload system in PHP. Is there a good standalone library that handles: creating the system for uploading, renaming, and perhaps resizing images; and creating the front end, the HTML and/or JavaScript for the form-upload parts. I've found a lot of image gallery systems, but no promising standalone system and UI libraries ready for insertion into custom PHP.
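
    For scale, the server-side core such a library wraps is fairly small; a minimal hand-rolled sketch (the field name "image" and the uploads/ directory are assumptions):

        <?php
        $dir = __DIR__ . '/uploads/';
        $ext = array('image/jpeg' => '.jpg', 'image/png' => '.png', 'image/gif' => '.gif');

        if (isset($_FILES['image']) && $_FILES['image']['error'] === UPLOAD_ERR_OK) {
            // inspect the file itself rather than trusting the client's MIME type
            $info = getimagesize($_FILES['image']['tmp_name']);
            if ($info !== false && isset($ext[$info['mime']])) {
                $name = uniqid('img_', true) . $ext[$info['mime']]; // safe rename
                if (move_uploaded_file($_FILES['image']['tmp_name'], $dir . $name)) {
                    echo 'Stored as ' . htmlspecialchars($name);
                }
            }
        }

    Resizing is usually layered on top with the GD functions (imagecreatefromjpeg, imagecopyresampled, and friends).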

    Read the article

  • 500 error on Joomla website

    - by Rachel Sparks
    PHP Fatal error: Call to a member function setQuery() on a non-object in /home/josh/public_html/administrator/components/com_jfusion/plugins/phpbb3/forum.php on line 226. We just moved over to a new server. Anyone have ideas as to what is wrong? Is this a database issue? Line 226 is the setQuery() call below:

        //get permissions for all forums in case more than one module/plugin
        //is present with different settings
        $db = & JFusionFactory::getDatabase($this->getJname());
        $query = "SELECT forum_id FROM #__forums WHERE forum_type = 1 ORDER BY left_id";
        $db->setQuery($query);   // line 226
        $forumids = $db->loadResultArray();
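
    The error means $db is not an object, i.e. JFusionFactory::getDatabase() returned nothing usable, which after a server move usually points at the JFusion plugin's stored connection settings. A quick diagnostic sketch (using only the names already in the quoted code):

        $db = & JFusionFactory::getDatabase($this->getJname());
        if (!is_object($db)) {
            die('JFusion returned no database object for ' . $this->getJname()
                . ' - check the plugin\'s database settings for the new server');
        }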

    Read the article

  • Singular or Plural Nouns as file names for better Search & SEO friendliness? [closed]

    - by Sam
    Possible Duplicate: Should I use singular or plural nouns in a domain name and why? Dear folks, here are two scenarios where file names should best represent the search volume of the audiences searching for them. Scenario 1: website.org/en/logo.php website.org/en/brochure.php website.org/en/poster.php website.org/en/design.php OR Scenario 2: website.org/en/logos.php website.org/en/brochures.php website.org/en/posters.php website.org/en/designs.php Q1. What do you intuitively think would be best? Q2. What do the facts show in general: do people search for the singular or the plural? Q3. Do search engines have a common rule of thumb for this? Q4. Should I pick one scenario and apply it consistently, or does it depend on the word? Thanks very much for your ideas/suggestions. I really don't know which one to go for.

    Read the article

  • Where can I find my comScore rank?

    - by Joyce Babu
    Recently an ad network rejected my registration, stating that my site doesn't meet their minimum monthly impressions, even though the site serves three times the required page views. When I contacted them for details, their representative hinted that they use comScore data for screening submissions. Where can I view my site's comScore ranking and details? Update: I was able to find the traffic figures by tagging my site with comScore Direct.

    Read the article

  • Can .htaccess slow down a site?

    - by Cody Sharp
    I'm working with a client on an e-commerce website. I implemented clean URLs using .htaccess. I also used .htaccess to solve canonical issues, such as redirecting www to non-www and removing index.php from the URL. The website recently began to slow down dramatically, sometimes not even loading. The site is hosted on GoDaddy, and when the client called, GoDaddy told him it was the .htaccess file slowing down the website. Based on my past experience I find this highly unlikely, but I'm not 100% sure. My thinking is that the client's website is most likely on a shared server in a busy neighborhood, which slows down the site. It's not always slow; it's sporadic throughout the day, loading fast at some points and slowly at others. Can an .htaccess file slow a website to a crawl? If so, are there better ways to solve these problems with different rewrite rules and such? Here is what the actual .htaccess file looks like:

        Options +FollowSymlinks
        RewriteEngine On
        RewriteBase /
        RewriteCond %{HTTP_HOST} ^www.example.net [NC]
        RewriteRule ^(.*)$ http://example.net/$1 [L,R=301]
        RewriteRule ^products/([0-9a-zA-Z\_\-]*)\.htm([l]?)$ index.php?p=product&product_code=$1 [L]
        RewriteRule ^catalog/([0-9a-zA-Z\_\-]*)\.htm([l]?)$ index.php?p=catalog&catalog_code=$1 [L]
        RewriteRule ^pages/([0-9a-zA-Z\_\-]*)\.htm([l]?)$ index.php?p=page&page_id=$1 [L]
        RewriteRule ^index\.htm([l]?)$ index.php?p=home [L]
        RewriteRule ^site_map\.htm([l]?)$ index.php?p=site_map [L]
        RewriteCond %{QUERY_STRING} ^p=home$
        RewriteRule (.*) ? [R=permanent]

    I'm a .htaccess and regex novice, so any mistakes you can point out would also help. Thank you.

    Read the article

  • Has anyone been able to convert a site's media content from Flash to HTML5?

    - by Muhammad
    I'm looking to convert a site from Flash-based video to HTML5. The current video uses time marks to display slides (kind of like how YouTube shows ads on its videos), but the difference between YouTube and my site is that the slides don't show up inside the video; they are displayed next to it. Is there any way I can accomplish this with HTML5, or do I have to use JavaScript for it? If this isn't clear enough, please let me know.
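
    HTML5 on its own only plays the video; syncing external slides takes a little JavaScript listening to the video element's timeupdate event. A sketch (the element IDs and cue times are made up):

        <video id="talk" src="talk.mp4" controls></video>
        <div id="slide"></div>
        <script>
        var cues = [                       // seconds -> slide markup
          { t: 0,  html: 'Slide 1' },
          { t: 30, html: 'Slide 2' },
          { t: 75, html: 'Slide 3' }
        ];
        var video = document.getElementById('talk');
        video.addEventListener('timeupdate', function () {
          var current = cues[0];
          for (var i = 0; i < cues.length; i++) {
            if (video.currentTime >= cues[i].t) { current = cues[i]; }
          }
          document.getElementById('slide').innerHTML = current.html;
        });
        </script>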

    Read the article

  • Weird referral traffic [closed]

    - by Noam
    Possible Duplicate: Strange incoming links appearing on site statistics. I'm getting weird traffic from Japan, from a site called ime.nu. Why weird? Because I can't identify the link, and when I go to their homepage it just shows an Apache test page, yet traffic-analysis sites show it is a pretty big site (Alexa rank 121 in Japan). Can someone help me understand the mystery?

    Read the article

  • SSL setup with GoDaddy subdomains and EC2 servers

    - by Kevin
    We have two EC2 instances that are used to host various scripts. Our main page, companyname.com, is hosted with GoDaddy but is unrelated to those EC2 instances. I need to set up SSL connections for the two EC2 micro instances, one running the Linux AMI and the other running Windows Server. I purchased two single-domain Comodo certificates and am at the step of generating CSRs on the instances. I'm not sure what to put as the "Server Name" on EC2. I would like each server to be accessible through a subdomain, which I have pointed on GoDaddy to the elastic IPs on EC2. For the server name, do I use the elastic IP, the EC2 public DNS, or the subdomain that I want? And which of these do I then place in my VirtualHosts file for Apache? The Windows instance is running IIS7, but the Apache box is the priority.
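
    In general, a certificate is issued for the host name visitors type, so the CSR's Common Name should be the subdomain (the fully qualified name), never the elastic IP or the EC2 public DNS, and the same name goes in ServerName in the Apache virtual host. A sketch of generating the CSR on the Linux box (the file names are arbitrary):

        openssl req -new -newkey rsa:2048 -nodes \
            -keyout server.key -out server.csr
        # At the "Common Name" prompt, enter the subdomain you will serve,
        # e.g. scripts.companyname.com (a hypothetical name)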

    Read the article

  • Prevent hotlinking of attachments

    - by reggie
    People are able to embed my forum's attachments (vBulletin). I tried to create an .htaccess rule against the hotlinking, but it did not work:

        RewriteCond %{HTTP_REFERER} !^$
        RewriteCond %{HTTP_REFERER} !^http://(www\.)?mydomain\.com.*$ [NC]
        RewriteRule attachmentid=\d+(\&d=\d*)?|\.([Gg][Ii][Ff]|[Jj][Pp][Gg])$ http://mydomain.com/antihotlink.jpeg [R]

    Is it not possible to check for numbers in regular expressions in .htaccess files?
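
    \d does work in .htaccess (Apache uses PCRE); the likelier problem is that a RewriteRule pattern is matched against the URL path only, never the query string, so attachmentid=\d+ can never match there. Moving the query-string test into a RewriteCond is the usual fix; a sketch:

        RewriteEngine On
        RewriteCond %{HTTP_REFERER} !^$
        RewriteCond %{HTTP_REFERER} !^http://(www\.)?mydomain\.com [NC]
        # don't match the placeholder image itself, or the redirect loops
        RewriteCond %{REQUEST_URI} !antihotlink\.jpeg$
        RewriteCond %{QUERY_STRING} attachmentid=\d+ [OR]
        RewriteCond %{REQUEST_URI} \.(gif|jpe?g)$ [NC]
        RewriteRule .* http://mydomain.com/antihotlink.jpeg [R,L]

    Conditions are ANDed by default, and the [OR] groups the two content tests, so the rule fires for off-site referrers requesting either an attachment URL or an image.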

    Read the article

  • Do AdWords conversions only track AdWords visitors?

    - by atticae
    I have set up an AdWords conversion and put the conversion-tracking JS code on a test page, but I don't see any conversions tracked when I visit it. Does AdWords conversion tracking only register a conversion if the visitor comes to the site by clicking on an AdWords ad? Google's help page advises me to test the code by clicking a campaign ad. However, I would like to use the tracking for all conversions, not just AdWords ones. I considered using Analytics as well, but it seems you can only track goals by URL there, not from JS, which would mean restructuring part of my page (because currently a conversion does not necessarily happen on a different URL). Is AdWords tracking not a viable solution for tracking all visitor conversions on my site?

    Read the article

  • URL rewrite from www.domain.com/subdirectory to http://domain.com/subdirectory

    - by chrizzbee
    I need a solution for the following problem: I use a CMS and want the backend to be available only at http://domain.com/backend, not at http://www.domain.com/backend. How do I have to change my .htaccess file to achieve this? I already have a rewrite rule from non-www to www. Here's what I currently have in my .htaccess file:

        ##
        # Uncomment the following lines to add "www." to the domain:
        #
        RewriteCond %{HTTP_HOST} ^shaba-baden\.ch$ [NC]
        RewriteRule (.*) http://www.shaba-baden.ch/$1 [R=301,L]
        #
        # Uncomment the following lines to remove "www." from the domain:
        #
        # RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
        # RewriteRule (.*) http://example.com/$1 [R=301,L]
        #
        # Make sure to replace "example.com" with your domain name.
        ##

    So, the first bit is the redirect from non-www to www, and it works on the domain part of the URL. As explained, I need a rewrite rule for the backend login, from http://www.shaba-baden.ch/contao to http://shaba-baden.ch/contao
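
    A possible sketch: add a more specific rule for the backend path above the general www rule so it is matched first, and exclude the backend path from the www redirect so the two rules can't ping-pong:

        RewriteEngine On

        # backend: force www off, only for /contao
        RewriteCond %{HTTP_HOST} ^www\.shaba-baden\.ch$ [NC]
        RewriteCond %{REQUEST_URI} ^/contao
        RewriteRule (.*) http://shaba-baden.ch/$1 [R=301,L]

        # everything else: force www on
        RewriteCond %{HTTP_HOST} ^shaba-baden\.ch$ [NC]
        RewriteCond %{REQUEST_URI} !^/contao
        RewriteRule (.*) http://www.shaba-baden.ch/$1 [R=301,L]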

    Read the article

  • How to fix this specific Google "Fetch as Googlebot" error appearing in my Webmaster Tools?

    - by UXdesigner
    Good day. I'm currently trying to find out why my website has lost all of its rank in Google: I no longer appear in Google results even for my own domain, although other sites that link to me do appear in the results. I think it's because I left the site alone for two months and came back to 20k comment-spam entries, which I completely deleted and fixed with filters and a new Disqus comment service. I added my site to Google Webmaster Tools and am finding out several awful things. For example, when I click Fetch as Googlebot, I receive the error message below in response to my request, and I don't know what the real problem is or how to fix it. This is what appears:

        Date: Wednesday, July 20, 2011 9:43:35 AM PDT
        Googlebot Type: Web
        Download Time (in milliseconds): 55

        HTTP/1.1 403 Forbidden
        Date: Wed, 20 Jul 2011 16:43:36 GMT
        Server: Apache
        Vary: Accept-Encoding
        Content-Encoding: gzip
        Content-Length: 248
        Keep-Alive: timeout=2, max=100
        Connection: Keep-Alive
        Content-Type: text/html; charset=iso-8859-1

        403 Forbidden
        Forbidden: You don't have permission to access / on this server.
        Additionally, a 403 Forbidden error was encountered while trying to use
        an ErrorDocument to handle the request.

    Do you guys know anything about this problem? I need to have Google crawl my site again. I used to have really good Google results for the past three years; now there's nothing. Thanks.
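
    One pattern worth checking: a 403 that hits Googlebot while normal browsers get through is often caused by leftover anti-spam rules in .htaccess or the server config. Hypothetical examples of the kind of directives to search for and remove (not taken from this server):

        # a user-agent block that catches Google's crawler
        SetEnvIfNoCase User-Agent "Googlebot" bad_bot
        Order Allow,Deny
        Allow from all
        Deny from env=bad_bot

        # or a blanket IP deny that happens to cover Google's 66.249.x.x crawl range
        Deny from 66.249.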

    Read the article
