Search Results

Search found 9728 results on 390 pages for 'meysam pro'.


  • Why is Google not crawling my website?

    - by Aman Virk
    I am running a design and development blog, http://www.thetutlage.com/. Over the last couple of days my search traffic has dropped from 70% to 10%. I am against black hat SEO myself; all I do is write my own unique content, almost every day. Last week my search traffic was really good, but now it is dropping sharply. I have checked my Webmaster Tools dashboard and there is no message there from Google. When I checked the server logs, I found that the last time Google crawled my website was on 27 September 2012. I really have no idea what I am doing wrong; I follow all the Google guidelines like a bible. Please help me.
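
    A quick way to double-check when Googlebot last visited is to search the access logs directly. A minimal sketch, assuming an Apache-style access log (the path varies by host):

        grep -i "googlebot" /var/log/apache2/access.log | tail -n 5

    Note that a matching user agent can be spoofed; a reverse DNS lookup on the client IP (host <ip>) confirms whether it really resolves to googlebot.com.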

    Read the article

  • Why is Google Webmaster Tools crawling invalid URLS and showing 500 errors?

    - by Amos Kane
    Google Webmaster Tools is reporting 12k+ 500 errors. Eeek! None of the URLs are valid; they all contain www.youtube.com. First, why is Google crawling these URLs if they don't exist? I supplied a sitemap, and they are of course not in the sitemap. I don't have a robots.txt blocking anything. I've checked for invalid redirects (none), and for unclosed tags or anything else that would throw www.youtube.com into a URL by accident (none). In every 'linked from' entry, the referring URL is also a bad URL with www.youtube.com in it. Google's tools report no malware, and I can't check the server logs because the host won't give me access. Really stuck!! Any ideas appreciated!

    Read the article

  • At what visitor share do you stop supporting a given browser?

    - by adam
    I'm lead dev for a large website which has a higher-than-average percentage of IE6 users: about 4.4% of our audience. Our new version is going to make use of progressive enhancement, including transitions and effects as well as rounded corners, gradients, web fonts and other CSS techniques. Obviously there are cross-browser ways to achieve most of these things, requiring varying amounts of work to implement. What I'm currently looking into, and what I'd like your experiences of, is how to decide at what point we draw the line between providing an enhanced experience and just supporting the functionality. FYI, I believe that this question meets the six guidelines for great subjective questions as defined in the FAQ. I'm after answers detailing why and how, not too short, with constructive comments, experiences, facts and references. Thanks! Adam

    Read the article

  • Is real estate booming again in India?

    - by skzameen
    The present real estate scenario in India is very good. The real estate boom in India is directly linked to industrial, commercial and economic growth; stability and the strong presence of international companies throughout India have made it a preferred destination for investment in the real estate sector. India is one of the fastest-growing economies and stock markets. For more information about residential and commercial projects or properties, log on to www.zameen-zaidad.com or email [email protected]. Contact us: Zameen-zaidad.com, Ph: +91-11-40024002, M: +91-9810445860. Share your opinion of the www.zameen-zaidad.com portal.

    Read the article

  • Google Webmaster Tools / geographic location settings

    - by JochemTheSchoolKid
    I am building a website on an .nl domain. At the moment it only shows up on google.nl, and I would like it to be findable in all the Google variants (google.com, google.co.uk and so on). The Google forums say to go to Webmaster Tools and change the geographic target there, but I have added the site and I am not able to change it: there is no select box. I have no idea where else to search (yes, I searched on Google, of course) or where to ask about this particular problem, so maybe someone here can redirect me or explain what is and isn't possible. The question: can I make an .nl domain findable in (almost) all Google search sites, and if so, how? A picture of my Google Webmaster Tools (nl) settings: http://i.stack.imgur.com/ZuP4L.png

    Read the article

  • Forum engine with full LDAP integration [closed]

    - by Andrian Nord
    We are looking for a forum engine that can genuinely maintain its user data in LDAP, possibly via mods. The core requirement is the ability to maintain that data, i.e. all user profile settings, such as nickname, password, email, avatar, birthday and others (preferably configurable). One example of good LDAP integration, of the level I'm expecting, is Drupal's LDAP integration, which allows mapping any user attribute into LDAP and keeps it in sync with the database. A year ago I did a small survey of the existing free/FOSS engines and found a few forum engines with LDAP integration, namely SMF, phpBB and something else. The most complete solution was phpBB3, which supports LDAP integration out of the box, but it is unable to sync data with changes made on the LDAP server by other software. Actually, it wasn't even propagating changes back, never mind mapping additional attributes (other than name/password/email). Also, I haven't found any forum whose architecture has a proper abstraction over user settings, so I doubt these engines (including phpBB) could be modded to add such functionality without dramatic changes to the core codebase. More recent research showed that even some commercial software, like IPB, is unable to keep its database synced with an LDAP directory or to map additional attributes. In other words, all the support I've seen so far is simple user creation upon first login, which is not good for us, as the forum is not the primary site and should not maintain its own user base (to reduce the risk of collisions). LDAP integration is required because many other services (FTP, email, Jabber, the Drupal site) use the same user base. Currently we have a forum embedded into the Drupal site, but we are unsatisfied with its features. BTW, we are using Linux, and this is not a duplicate of this question, as its author seems to be satisfied with the behaviour described above. So, my question is: are there any (preferably FOSS and free) forum engines that can import, export, keep in sync with, or otherwise integrate with an LDAP user database (preferably with the ability to map additional fields to LDAP attributes)?

    Read the article

  • rel="translation"

    - by Tom Gullen
    I can't find much online about rel="translation". We have tutorials and manual entries which we are going to get users to translate. If the original page in English is http://www.scirra.com/tutorial/start and there are two translations, http://www.scirra.com/tutorial/es/start (Spanish) and http://www.scirra.com/tutorial/de/start (German), how would I correctly link all these up? I'm aware that at the top of each page I would need to specify the correct ISO 639-1 code: <html lang="de">. But I'm more interested in letting Google know that they are not duplicates but translations.
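
    For what it's worth, what Google actually documents for this case is rel="alternate" together with hreflang annotations, rather than rel="translation". A minimal sketch for the three URLs above, placed in the <head> of each version:

        <link rel="alternate" hreflang="en" href="http://www.scirra.com/tutorial/start" />
        <link rel="alternate" hreflang="es" href="http://www.scirra.com/tutorial/es/start" />
        <link rel="alternate" hreflang="de" href="http://www.scirra.com/tutorial/de/start" />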

    Read the article

  • 403 error on index file

    - by John L.
    When I try to access index.py in my server root through http://domain/, I get a 403 Forbidden error, but I can access it through http://domain/index.py. My server logs say "Options ExecCGI is off in this directory: /var/www/index.py". However, my httpd.conf entry for that directory is the same as the ones for other directories, where getting to index.py works fine. Permissions are set to 755 for index.py. I also tried making a PHP file named index.php, and it works from both domain/ and domain/index.php. Here is my httpd.conf entry:

        <Directory /var/www>
            Options Indexes Includes FollowSymLinks MultiViews
            AllowOverride All
            Order allow,deny
            Allow from all
            AddHandler cgi-script .cgi
            AddHandler cgi-script .pl
            AddHandler cgi-script .py
            Options +ExecCGI
            DirectoryIndex index.html index.php index.py
        </Directory>

    Thanks

    Read the article

  • Does Bing support anything like Google's First Click Free program?

    - by Dan Fabulich
    Google has a program for webmasters called First Click Free. To implement First Click Free, you need to allow all users who find a document on your site via Google search to see the full text of that document, even if they have not registered or subscribed to see that content. The user's first click to your content area is free. However, once that user clicks a link on the original page, you can require them to sign in or register to read further. The user must be able to see the full content of a multi-page article. You can allow this by displaying all content on a single page to both Googlebot and users. Alternatively, you can use cookies to make sure that a user can visit each page of a multi-page article before being asked for registration or payment. Does Bing support anything like this?

    Read the article

  • What data to send when tracking clicks with Google Analytics events (and how)?

    - by user359650
    When tracking clicks on links, there are three items I'm interested in:

    - link location in the page, grabbed from the id of the closest parent: to see the influence of location on click-through
    - link text: to see the influence of text on click-through
    - link href attribute value: to see where people go when leaving my website

    The problem when using Google Analytics to track those clicks is that events only have three available text fields, one of which is the category. If you use the category to store one of the above items, you will create a mess in your Event reporting, because you will have as many categories as item values. Therefore, if you assign a predefined value to the category (e.g. clicks), you're left with only two event fields (action, label) to store three items (location, text, href). That in itself isn't the end of the world, because you can concatenate two items into one event field and then use the reporting or the API to filter things out. Accordingly, what I plan on doing is this:

        category: clicks
        action: {location_on_page} ¦ {text}
        label: {href}

    where the bracketed tokens are variable values related to the clicked link. With this I can easily create some reports directly via the GUI: downloads (include only events where the label ends with .pdf) and click-outs to particular domains (include only events where the label contains the domain). For more complex tasks I need to export the data (or use the API): for the influence of location on clicks, for each location in the design, count the number of events that have that location in the action, then corroborate with pageviews of the corresponding pages. Whilst this looks good, I'm wondering if there is a better approach, hence the following questions:

    Q1: Can you foresee any particular issues with this particular setup (e.g. things I won't be able to report on)?
    Q2: Can you think of other data that would be interesting to include in the event?
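
    A minimal sketch of this scheme using classic (ga.js) event tracking; the jQuery delegation and the closest-ancestor-with-an-id heuristic are illustrative assumptions, not part of the plan above:

        // One event per link click: category fixed to 'clicks',
        // action = "{location_on_page} ¦ {text}", label = href.
        $(document).on('click', 'a', function () {
          var location = $(this).closest('[id]').attr('id') || 'unknown';
          var text = $.trim($(this).text());
          _gaq.push(['_trackEvent', 'clicks', location + ' ¦ ' + text, this.href]);
        });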

    Read the article

  • Default Wordpress site on IIS

    - by Mike
    We have multiple WordPress installations on our IIS7 (Windows Server 2008) server, as follows:

        http://www.example.com/site_one
        http://www.example.com/site_two
        http://www.example.com/site_three

    These all work properly. However, we would like to configure it so that when users visit the root domain (http://www.example.com/) or any page underneath it, i.e.:

        http://www.example.com/
        http://www.example.com/page1
        http://www.example.com/page2

    they would actually see the corresponding pages for site_two:

        http://www.example.com/site_two/
        http://www.example.com/site_two/page1
        http://www.example.com/site_two/page2

    How could we achieve this?
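
    One common approach is a rewrite rule in the root web.config, assuming the IIS URL Rewrite module is installed. The rule name and exclusion pattern below are illustrative assumptions, and WordPress itself may also need its address settings adjusted; a sketch:

        <configuration>
          <system.webServer>
            <rewrite>
              <rules>
                <rule name="ServeSiteTwoAtRoot" stopProcessing="true">
                  <!-- Leave requests for the three real subdirectories untouched -->
                  <match url="^(?!site_one|site_two|site_three)(.*)$" />
                  <action type="Rewrite" url="site_two/{R:1}" />
                </rule>
              </rules>
            </rewrite>
          </system.webServer>
        </configuration>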

    Read the article

  • What is a generalized form creator that runs on .NET / Windows?

    - by Josh
    At the institution I'm at, we've been looking for web applications that enable users to create and deploy their own forms. Similar applications are Wufoo and Google Forms. Unfortunately, those solutions will not work for us, because we are required to host all data and information on our own servers. I've found a few solutions written in PHP, but at this point it doesn't appear that this is acceptable. I've tried searching for ".net form creator", but unfortunately, when you search for ".net forms" you get a lot of results about creating ASP.NET WebForms, which is not what we're looking for at all. I've been told that a solution that runs on .NET and Windows servers with either Oracle or MSSQL databases would be much more acceptable. I've found a few, but they are open source, and the IS security people are not kind to those solutions, despite my attempts to show otherwise. If anyone knows of some solution out there, I would greatly appreciate you passing on the names of those applications!

    Read the article

  • Email links open in a new window [closed]

    - by Dan
    I'm asking this as an opinion question: how does everyone treat email links opening in a new window when the visitor's default email client is web based? This way? <a href="mailto:[email protected]">email me</a> That will open fine for app-based email clients but opens in the same window for web-based clients. Or this way? <a href="mailto:[email protected]" target="_blank">email me</a> That will open in a new tab for web-based email clients but leaves a blank tab behind for app-based ones. I can't really seem to find the best of both worlds. What does everyone else do?

    Read the article

  • Recording custom variables to identify individual users with Google Analytics

    - by mrtsherman
    I have been asked by our marketing department to add Google Analytics custom variable tracking to my company's website. As the website uses server-side includes, modifications to the tracking tag roll out globally; maintenance is therefore a headache! So, suppose I add the following code (keeping in mind that, with SSI, every page has the same code):

        // Visitor-level tracking, id = 12345.
        // Record a unique id for each visitor; when they return, also track this id.
        _gaq.push(['_setCustomVar', 1, 'id', '12345', 1]);

        // Page-level tracking.
        // If the user signs up for our newsletter we set newsletter to true.
        // Each page they subsequently visit should also mark this as true.
        _gaq.push(['_setCustomVar', 1, 'newsletter', 'true', 1]);

    I don't use GA and the marketing people don't use custom variables, so we don't actually know how or if this will work. Therefore my questions are:

    - Do I want page, session or visitor level tracking?
    - What happens when the same code is used on every page? Can GA 'overwrite' a setting? For example, if I set newsletter to true on page X and the user then navigates to page Y, will the variable also be marked there?
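
    For reference, the classic ga.js call used above takes an optional scope as its last argument. The note and slot choice below are explanatory assumptions, not part of the original snippet:

        // _gaq.push(['_setCustomVar', index, name, value, opt_scope]);
        // index is a slot from 1 to 5 (reusing a slot overwrites its previous value),
        // and opt_scope is 1 = visitor, 2 = session, 3 = page.
        _gaq.push(['_setCustomVar', 2, 'newsletter', 'true', 3]); // page-scoped, in its own slot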

    Read the article

  • How often does Dreamhost change IP addresses?

    - by pjreddie
    So I just migrated our site to Dreamhost because they are free for non-profits. However, right after I switched the nameservers over to them, they changed the IP address of the site: first they propagated IP address x.x.x.180, then they switched it to x.x.x.178 and had to propagate that out. The point being, it meant a lot of downtime, since a lot of big DNS servers (like Google's) thought the address was still x.x.x.180 for up to 5 hours after the switch. This is compounded by the fact that most of our visitors live here in Unalaska, and we have local DNS servers that take a LONG time to update (a day or more), since we get all our internet over satellite. So every time Dreamhost changes our IP address, it can mean a day of downtime for us in our community. My question is: how often do these changes take place? I asked Dreamhost support and they gave me a vague response: "I wish I could say, however those changes happen at random times. They're not that frequent, maybe even months between updates, but there's no way to know for sure." First, I find it hard to believe that they don't know their own system well enough to give me at least some estimate or average. Second, is it worth looking at other providers so that I can get a static IP address? We were hosting the site here originally and hadn't run into this problem, since we have a static IP here. We don't get a ton of traffic, usually around 500 hits a day or so, sometimes more when our stories are featured on statewide or national news broadcasts. So hours of downtime every time Dreamhost "randomly" decides to move our server location can be bad for our readership.
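
    How long stale answers linger is governed by the DNS record's TTL, which any resolver will show; a quick check with dig (the domain below is a placeholder):

        dig +noall +answer www.example.com A

    The second field of each answer line is the TTL in seconds, i.e. how long resolvers may keep serving the old address after a change.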

    Read the article

  • Authorize.net CIM or using the module's storage

    - by CQM
    This site is intended to allow users to sign up and pay for a service; they will be able to pay using PayPal and Authorize.net. Since I am using two different payment gateways, it makes me wonder where I should keep the user information. Authorize.net offers CIM, but some users will pay with PayPal, so Authorize.net won't have every user's information. Would the best solution then be to not use CIM and store everything within my member database module? For the record, I am using OSE for Joomla for my subscriber service.

    Read the article

  • Photo gallery for Blogspot blog

    - by Django Reinhardt
    I don't think this is entirely possible, but here we go: I have a friend who has a Blogspot blog. He has posts with text, posts with videos and posts with photos... and he was wondering if there's any way to turn the posts which are just photos into a thumbnail gallery screen on his blog. So for example, let's say he has 20 photo posts on his blog with the label Skiing Holiday 2009 (horrible example, I know). Is there a way of having a post created for his blog that displays those photos as thumbnails, linking through to their full size versions? I just don't think it's possible, but I'm really hoping someone will be able to offer a solution (or even a place where I could find a solution). Thanks a lot.

    Read the article

  • Cost effective way to provide static media content

    - by james
    I'd like to be able to deliver around 50MB of static content, either as about 30 individual files of up to 10MB each or grouped into 3 compressed files, around 5k to 20k times a day. Ideally I'd like to put some very basic security around providing the data, to ensure that a request comes from the expected source, but if dropping the security brings a big reduction in price, that's an option. Does anyone have any suggestions other than what I've found? Google App Engine is $0.12/GB and I believe has a file size limit of 10MB, so I'd have to break the data up a bit; a rough calculation suggests this would cost me about $30 to $120 a day. Alternatively, there's plain static content delivery with no logic capabilities, like Usenet.nl, at what I calculate to be about $0.025/GB, which would cost me about $6 to $25 a day. Any idea if I'm going about these calculations the right way, and whether there might be a better option for plain static content at decently high delivery volume? Again, some basic security would be great, but if cost is greatly reduced without it, then I'm up for that.
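
    For reference, the arithmetic behind those daily estimates:

        50 MB x 5,000 requests  =   250 GB/day; 250 GB x $0.12/GB   = $30/day
        50 MB x 20,000 requests = 1,000 GB/day; 1,000 GB x $0.12/GB = $120/day
        At $0.025/GB: 250 GB/day is about $6/day and 1,000 GB/day is $25/day.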

    Read the article

  • Strange robots.txt - how and why did it get there?

    - by Mick
    I recently created a very simple, pure HTML website which I have hosted with Hostmonster. Hostmonster had very good reviews on some comparison website, and in general so far they appear to be perfectly good in every way... at least I thought so until just now. I have been making lots of edits to my site on an almost daily basis. My site now appears on the first page (7th on the list) for my most important keyphrase in a Google search. But I did notice a problem with the snippet chosen by Google. I asked a question on this site about snippets and got some great answers. I then made some modifications to my metadata, and within 48 hours the Google snippet for my search was perfect. The odd thing, though, was that the "cached" version Google had still appeared to be very old, from about three weeks previous. This seemed very odd: how could the Google robots have read my new metadata without updating the cache? This puzzled me greatly. Just now it occurred to me that maybe I had some goofy setting in my robots.txt file. I didn't actually remember even making one, but I thought I'd have a look just in case. Much to my horror, I saw that there was a robots.txt and it contained the disturbing text below:

        sitemap: http://cdn.attracta.com/sitemap/728687.xml.gz

    Intuitively this looks like some kind of junk spam trick, and I had indeed been getting some spam from "attracta". So my questions are:

    1. Should I simply delete this robots.txt?
    2. Was the file there all along, placed there because of some commercial tie-in between attracta and Hostmonster?
    3. Does the attracta robots.txt explain the lack of re-caching?
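
    If the file does turn out to be unwanted, a minimal replacement that allows normal crawling looks like this (the sitemap line is optional, and the URL is a placeholder):

        User-agent: *
        Disallow:

        Sitemap: http://www.example.com/sitemap.xml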

    Read the article

  • Why do people crawl sites without downloading pictures?

    - by Michael
    Let me show you what I mean:

        IP               Pages   Hits   Bandwidth
        85.xx.xx.xxx     236     236    735.00 KB
        195.xx.xxx.xx    164     164    533.74 KB
        95.xxx.xxx.xxx   90      90     293.47 KB

    It's very clear that these people are crawling my site with bots; there's no way you could visit my site and use less than 1MB of bandwidth. You might say there's the possibility that they are browsing the site using some browser or plug-in that does not download images, js/css files, etc., but the simple fact of the matter is that there are not 90-236 pages linked from the home page (outside of WP files), even if you visited every page twice. I could understand if these people were crawling the site for pictures, but once again, the bandwidth indicates that this isn't what is happening. Why, then, would they crawl the site simply to view the HTML/txt/js/etc. files? The only thing I can come up with is that they are scanning for outdated versions of WordPress, SQL injection vulnerabilities, etc., which makes me inclined to outright ban the IPs. But I'm curious: is it possible that these are legitimate users or, at the very least, not intending to be harmful?

    Read the article

  • Making profit from a social network

    - by James P.
    This follows similar questions, but I'd like to see if anything particular comes out of it due to the nature of the site. In short, I've taken up the role of webmaster for a small social network site and wish to make it profitable, at least enough to cover the running costs. The site is linked to a commerce site, and presents are offered to members according to the number of points they've accumulated through various actions. The site is running on shared hosting, so that part is probably dirt cheap, but the presents can be expensive as a whole, and some money has already been invested in the project. One idea I have is to seek sponsors that would be willing to offer presents or special offers in return for publicity; I don't know if this will be easy or not. I'm also looking into adapting the hosting, perhaps moving static files to a cheaper online storage medium (see "Ideas for reducing storage needs and/or costs (lots of images)"). Other suggestions are welcome.

    Read the article

  • Which image sharing websites support dynamic file uploading via an API?

    - by KoolKabin
    Hi, I have been searching for an image hosting website that displays a user's images in a nice, managed way. I want to upload files to my account on that image hosting website from a page on my own website. That is, if I have a website abc.com, a user browses my website abc.com and uploads a file to my website; I then want to transfer the uploaded file to the image hosting website, so that it can be viewed by that hosting website's other users and gets better visibility to the world.
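
    That relay step is usually just a server-side HTTP POST to the host's upload API. A hypothetical sketch in Node.js (18+, where fetch, FormData and Blob are globals); the endpoint and API key are placeholders, not any real service's API:

        import { readFile } from "node:fs/promises";

        // Forward a file that the user already uploaded to us on to an image host.
        async function relayToImageHost(localPath, fileName) {
          const form = new FormData();
          form.append("image", new Blob([await readFile(localPath)]), fileName);
          const res = await fetch("https://api.image-host.example/upload?key=YOUR_KEY", {
            method: "POST",
            body: form,
          });
          if (!res.ok) throw new Error("Upload failed: " + res.status);
          return res.json(); // typically includes the hosted image's public URL
        }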

    Read the article

  • Research about best way to present multiple products on one page

    - by Michael Dibbets
    I am updating a webshop page. This is a fairly simple page that displays all the products we currently sell. The page in development is visible here: https://www.ortho.nl/wwebshop. Now I am curious, and since I can't find anything via Google etc. (I probably don't know the right keywords): what is the best way to present multiple products on one page? Should you use borders? Should you use colours, and which colours? What kind of tweaks will direct the customer's attention to the right place? Does anyone know, from experience or via research (and could you point me in the right direction to find that research), what the best way to present products is, so that conversion/click-through is optimised?

    Read the article

  • Looking to trade a 1U HP Proliant DL360 G5 in exchange for a small linux VPS

    - by user597875
    I have a 1U HP Proliant DL360 G5 that I have no place to rack and would like to trade it for a small Linux VPS. If interested, let me know. Here are the specs of the server:

        CPU: Intel Xeon 5150 @ 2.66GHz, 4MB L2 cache
        Processor speed: 2.7GHz
        Processor sockets: 2
        Processor cores per socket: 2
        Logical processors: 4
        Memory: 8GB
        Drives: 4 x 72GB 10k SAS
        Manufacturer: HP
        Model: Proliant DL360 G5
        BIOS version: P58

    Read the article

  • Increase traffic to a site through a site on a subdomain [closed]

    - by user1716672
    Possible Duplicate: Subdomain versus subdirectory

    We have two sites: one is mainly a portfolio site (built with the Yii framework) and the other is a digital shop (built with OpenCart) where we sell plugins and themes. The URLs look like www.mydomain.com and www.store.mydomain.com, and both sites are on the same server. We use Google Analytics and have no problem getting traffic to our store, but we get very little to our portfolio site, and we want to increase our Google ranking for that site. Assuming increased traffic to the site will increase our Google ranking, we were thinking of using URL masking, so that the link would be www.mydomain.com/shop and this would load www.store.mydomain.com. Would this count as hits for our portfolio site? The .htaccess rules will ensure the subdomain is served, so I don't know whether these hits would count for our store or for our portfolio site.
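
    For what it's worth, a sketch of such a masking rule under Apache, assuming mod_rewrite and mod_proxy are enabled; the [P] flag proxies rather than redirects, so the browser keeps the www.mydomain.com/shop URL:

        RewriteEngine On
        RewriteRule ^shop/(.*)$ http://www.store.mydomain.com/$1 [P,L]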

    Read the article
