Search Results

Search found 9757 results on 391 pages for 'shekhar pro'.

Page 222/391

  • Bad links point to old domain - should I disavow on new domain?

    - by user32573
    I am working with a site which we'll call www.newdomain.com, which was hit by Penguin this month despite no unusual practices. I found lots of really spammy links to their old site, www.olddomain.com, which 301s to the new domain. So I've gone through the process of identifying which links are really bad, made contact to ask for removal, and am at the stage of disavowing links. But wait! None of the bad links point to newdomain.com, and I worry that a disavow request via this domain in Webmaster Tools will damage something. Do the old bad links affect the new site? If so, where do I disavow those old bad links? In Webmaster Tools for the new domain?

    Read the article

  • Website with over 1 million posts with not much textual content

    - by Far Se
    I've made a website which crawls files from all over the Internet, and I feel like Google will ban me if I send it sitemaps which contain all of these pages (1M+), because they contain only the file name/size/number of downloads and the download link(s). I worry about this because I've made another website like this in the past and Google banned me after one week with the reason "spam", even though it was not (maybe somebody falsely reported me?!). Does someone have an idea about how to keep Google from banning my website? I've seen several other sites like mine and they don't get banned or... anything. Also, should I send the sitemap, or wait until Google indexes every page as it finds it? Thanks in advance :)

    Read the article

  • Include latest searches in search engines index

    - by drcelus
    My websites generally include a page with the (user-input) latest searches. I know it's not a good security practice to allow this, since you can find undesired content. On the other hand, it boosts the number of pages indexed, since every new search can provide a link on Google, and people can find you with related keywords that you are not using on your web page. What is the rationale behind including or excluding these results in a search engine's index?

    Read the article

  • Building an intranet

    - by WernerCD
    I'm researching for a project I'm going to be doing at work on the side... I work for a small hospital and we recently upgraded all the browsers inside our intranet to IE8 (goodbye, 6 :). We have a small, obsolete intranet built by someone who isn't a web designer... functional enough, but annoying to maintain and really sparse. What I want to do is use a good framework. I'm looking for suggestions... something Windows/IIS-based. I'd love Windows authentication - with the ability to delegate sub-sections of the website to managers. Right now it's my job to add/update/delete anything on the site... I'd like something uncomplicated that can be delegated to non-technical people. For example, the cafeteria manager should be able to update the menu without putting in a ticket to me: she'd log into her computer, open the intranet (which would use her Windows log-on to identify her) and have elevated privileges to edit her section of the intranet. If I have to "extend" a good framework to get Windows authentication, I'll do it... but I'd prefer it to be baked in. What are some good frameworks, tools and places to start? While this isn't a huge project... it's going to be bigger than the basic stuff I've done before and I'd like a good place to start.

    Read the article

  • How should I setup billing for AdWords when managing a client's campaign in My Client Center? [closed]

    - by Dustin
    I have worked with Google AdWords before and will now be managing an AdWords account for a client. I have a My Client Center account, but I'm wondering what the best practices are for billing. Should I link billing to my own credit card and then have the client pay me (they have to pay me to manage the account anyway), or should I have the client pay Google directly? How is this usually done? If it is the latter, what is the best way to have them input their payment info?

    Read the article

  • Most standard / Best way to keep the same top menu among different web pages?

    - by jsoldi
    What's the standard way to keep the same menu on top across different web pages without having to duplicate it on each page? (I don't mean that it doesn't reload, like when using frames and only loading the bottom part; I want the menu to scroll with the page when scrolling down, like pretty much every single web page that exists.) I found this answer, but that asker can't use PHP and I can. Plus, I see several people giving different suggestions, but I assume there is a standard, since pretty much every single web page on the whole web has a menu on top that stays the same across multiple pages. I'm just a newbie at web design (I can write PHP and HTML easily, but I have no idea about standards and stuff like that, since I'm a self-taught guy ;)). What I would normally do is include the menu with PHP, but I'm not sure if this is the "standard".
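
    A minimal sketch of the PHP include approach mentioned above (the file names are hypothetical):

        <?php /* menu.php - the shared menu, kept in one file */ ?>
        <nav>
          <a href="/index.php">Home</a>
          <a href="/about.php">About</a>
        </nav>

        <?php /* about.php - each page pulls the menu in at the top */ ?>
        <?php include 'menu.php'; ?>
        <h1>About us</h1>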

    Read the article

  • Joomla url issue with sh404SEF

    - by user5858
    It's been a couple of months that I've been using sh404SEF with my site. But on my site I'm getting URLs in the form: http://www.downloadformsindia.com/Income-Tax-Forms/all-income-tax-return-itr-forms-2010-2011.html?task=view If I remove this suffix (?task=view), it takes us to the same page. I had raised this issue in the sh404SEF forum, and I was told that this data is taken as a parameter by search engines and hence ignored. I want to redirect, using a rewrite rule in .htaccess, all such URLs to the URLs without ?task=view: ....downloadformsindia.com/Income-Tax-Forms/all-income-tax-return-itr-forms-2010-2011.html?task=view should be redirected to http://www.downloadformsindia.com/Income-Tax-Forms/all-income-tax-return-itr-forms-2010-2011.html So my question is: will this redirection create 404s in Google Webmaster Tools? I have thousands of pages on the site.
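
    A minimal mod_rewrite sketch for this, assuming Apache (query strings are matched with RewriteCond rather than in the RewriteRule pattern, and the trailing ? drops the query string from the redirect target):

        # .htaccess - 301 any URL requested with ?task=view to the same URL without it
        RewriteEngine On
        RewriteCond %{QUERY_STRING} ^task=view$
        RewriteRule ^(.*)$ /$1? [R=301,L]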

    Read the article

  • Non-dynamic CMS [closed]

    - by user20457
    On some of the web sites I visit every day (news, sports, etc.), although the content changes very often (several times per day), the URLs always have a .html extension, which makes me think that the content has been generated once and then published as a static page, rather than generated on every call or even cached in memory. For example, the fictitious site "mysports.com" has a "futbol.html" page. Yesterday Messi got injured and they had another item to put on that page, so I presume they posted the new item in their CMS, and a publishing action was automatically triggered afterwards that recreated "futbol.html" on a CDN with the new item, probably discarding the oldest one. Then the ETag changes and clients get the new page when they try to access it. (The site is fictitious, but this is what I believe happened yesterday on the sports site I read.) This would fit the CQRS approach, and I presume they get huge performance from it. I know lots of CMSs (WP, Drupal, BlogEngine.NET, DNN, etc...), but I have never seen one able to do this, or at least I was not aware of the feature. What are these static-publishing CMSs called? Which are the most well known? Cheers.

    Read the article

  • Will the new site keep Google traffic from the old site when content is moved over? [closed]

    - by user1324762
    Possible Duplicate: new domain, old links are 301’d from old domain to new, how will this affect my rankings? I have a site about bikers. Now I have created a dating site for bikers. I don't need the old site any more; I want to move all articles to the new dating site. So basically, this is not only moving content to a new domain, but to an entirely new site. What I am planning to do is set up 301 redirects for all 200 articles. For pages that are not articles, I will just put up a message that the site will be down soon. Do you think that I will get all the Google traffic from the old site for those articles? Is there anything I should be aware of and careful about?
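
    A minimal sketch of what those per-article 301s could look like in the old domain's .htaccess (the paths and new domain are hypothetical):

        # mod_alias: map each old article URL to its new home, one line per article
        Redirect 301 /articles/helmet-buying-guide.html http://bikerdating.example.com/articles/helmet-buying-guide.html
        Redirect 301 /articles/winter-riding-tips.html http://bikerdating.example.com/articles/winter-riding-tips.html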

    Read the article

  • Segment subdomains with Google Analytics?

    - by andrewpthorp
    So, when a website has multiple subdomains - www.example.com, foo.example.com, bar.example.com - what is the best way to use Google Analytics to segment the data? I would prefer to have access to 'All Data', 'Data from foo.example.com', and 'Data from bar.example.com'. I tried setting up 3 different views and putting a filter on the foo/bar views that says: include only traffic from ISP domains that are equal to foo.example.com. However, I am not seeing any data collected in that view. I do, however, see all data in the 'All Data' view, but I can't figure out how to segment it. I am including analytics.js in the application.haml layout, which is always loaded in this app. Thanks!

    Read the article

  • Microformats, Reviews and Duplicate Content

    - by Nicholas
    Let's say I have a site that sells widgets, and the URL structure is like so: /[type-of-widget]/[sub-type]/[widget-name]/ So, a URL for a widget might be: /screwdrivers/philips-screwdrivers/acme-big-screwdriver/ We show reviews on the widget page, and use the appropriate microformat data so Google knows it's a review, etc. Now, what if I want to show random reviews in the "sub-type" and "type-of-widget" landing pages? Will Google ding me for duplicate content, or is it smart enough to know (based on microformat data/etc.) that this is not duplicate content?

    Read the article

  • Issue updating domain name servers from BlueHost to AWS

    - by cowls
    I am trying to migrate my site's hosting from Bluehost to an AWS cloud-based service. I have the site up and running on AWS with an Elastic IP configured, and it loads fine when I specify the IP address in the browser. I have gone into Route 53 in the AWS console and created a "hosted zone" for the domain. I then created a new record set of type "A" using the IP address as the value. I have a domain name registered with Bluehost. I've logged into the Bluehost account and updated the domain name servers to point to those specified in Route 53 in the AWS console. When I hit the IP address directly the site loads; however, it doesn't load when using the domain name (I get a Google Chrome "Oops" error page saying the page was not found). I've tried using this site: http://dns.squish.net/ to debug, but it seems to be giving me the correct results: fizaclegems.com 300 IN A 107.20.209.78 where 107.20.209.78 matches the Elastic IP configured in the AWS console. This is the result it gives for all 4 name servers. Am I missing a step here? Does anyone know what else I should be doing or looking for?

    Read the article

  • SEO - Hidden content before main site content

    - by 0pt1m1z3
    I have two hidden divs before my main site content, one with the login form and another with the signup form. I then have login and signup buttons within the page that use jQuery to show or hide these divs. I like the effect this setup offers, dropping down from the top of the page and pushing the rest of the content down. However, recently I have been getting serious about SEO, and I am wondering if these divs have been affecting my SERP rankings. Basically, every non-logged-in page (everything bots see) has the same two display:none; divs at the top of the document flow. Is it bad? Should I re-engineer these forms and the way they are displayed?

    Read the article

  • QR Codes and Short Links - Please Take A Look [closed]

    - by Joe Turner
    I'm looking for a way to create a QR code and a shortened link when a form is submitted. I have the QR code bit, but the link is too long for me, and the QR code looks scary and complicated. The way it works is: the user types in (in this instance) a contract number. Then a folder is created on the server for that contract number (www.mysite.com/QR/$contractnumber). Then, using PHP again, I create a QR code through Google, because I know that every QR code will be linking to the same place, just with a different ending of the link. The only bit that changes is the $POST... I was wondering if there was a way to shorten the link before it goes to Google? It would have to be through PHP. The user enters the contract number in the form, then that number (usually around 5/6 digits) will be entered into an already existing command? I'm not an expert in anything, I just know some really random snippets of code... and HTML and CSS, of course. Any help would be appreciated, and judging by the few days I have been searching this, I think it might help a few people in the future. I would also like to confirm that the solution can't be one of those visual URL shorteners. If it is, it just needs to be the back-end of it, built into an existing form and QR generator. Simple?
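
    A minimal PHP sketch, assuming the contract number is posted as $_POST['contract'] and that a short path like /q/12345 already redirects to the full /QR/ folder (both names are hypothetical):

        <?php
        // build a short URL from the contract number, then let Google's
        // Chart API render the QR code for that short URL
        $contract = preg_replace('/\D/', '', $_POST['contract']); // digits only
        $shortUrl = 'http://www.mysite.com/q/' . $contract;
        $qrSrc = 'https://chart.googleapis.com/chart?cht=qr&chs=300x300&chl='
               . urlencode($shortUrl);
        echo '<img src="' . htmlspecialchars($qrSrc) . '" alt="QR code">';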

    Read the article

  • How to CURL and avoid timeout death (Twitter Down) [migrated]

    - by David
    Twitter is down right now, and one of my site's home pages relies on getting data from Twitter ("relies" is the problem - it should be more of an accessory feature, as it just shows the follower count from its feed). Here's the code in question:

        function socials_Twitter_GetFollowerCount($username) {
            // fetch the profile JSON; a site helper caches the result for an hour
            $method = function () use ($username) {
                return file_get_contents('https://api.twitter.com/1/users/show.json?screen_name=' . $username . '&include_entities=true');
            };
            $json = cache('bmdtwitter', 3600, $method, false);
            $json = json_decode($json, true);
            return intval($json['followers_count']);
        }

    What is a good way to make it so that if Twitter is down (or not responsive for some reasonable amount of time) our site doesn't appear to be down? I think the timeout may be defaulting to 30-60 seconds or more.
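
    A minimal sketch of capping the wait, keeping file_get_contents but passing it a stream context with a short timeout ($url stands for the API URL above; cache_get is a hypothetical helper returning the last cached value):

        // give up on the HTTP request after 3 seconds instead of the default
        $context = stream_context_create(array('http' => array('timeout' => 3)));
        $json = @file_get_contents($url, false, $context);
        if ($json === false) {
            $json = cache_get('bmdtwitter'); // hypothetical: fall back to the stale cached copy
        }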

    Read the article

  • How to build a list from Postfix maillog

    - by dstonek
    I want to build a list from maillog and maillog.x containing something like date, sender's email, recipient's email and subject of the message, filtering by outgoing emails and outgoing domain. I've read about importing a CSV file into a spreadsheet program. The issue is that I would have to add field separators to the log file, and I couldn't find how to customize that. How can I produce the list and the separators? This is an example from the sending mail log:

        Jun 11 15:24:58 host postfix/cleanup[19060]: F41C660D98A0: warning: header Subject: TESTING SUBJECT from unknown[XXX.XXX.XXX.XXX]; [email protected] [email protected] proto=ESMTP helo=<[192.168.1.91]
        Jun 11 15:25:01 host postfix/smtp[19062]: F41C660D98A0: to=, relay=mx-rl.com[xxx.xxx.xxx.xxx]:25, delay=3.4, delays=0.66/0.01/0.86/1.9, dsn=2.0.0, status=sent (250 <538E30D9000A1DD8 Mail accepted)

    The list would contain the three bold fields, filtering by to = [email protected]
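
    A minimal PHP sketch of one way to do this, assuming the stock Postfix log format (the log path and output file are assumptions; note the subject lives on the cleanup line and the recipient on the smtp line, joined by the same queue ID, so a full script would merge the two on that ID):

        <?php
        // extract date, queue ID and recipient from "status=sent" smtp lines into CSV
        $out = fopen('sent.csv', 'w');
        fputcsv($out, array('date', 'queue_id', 'recipient'));
        foreach (file('/var/log/maillog') as $line) {
            if (preg_match('/^(\w+ +\d+ [\d:]+) \S+ postfix\/smtp\[\d+\]: (\w+): to=<([^>]*)>.*status=sent/', $line, $m)) {
                fputcsv($out, array($m[1], $m[2], $m[3]));
            }
        }
        fclose($out);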

    Read the article

  • How to Stop Browser from rejecting my downloads

    - by melki0795
    I have a portfolio site where I am trying to host some of my work so people can download it. Some of these files are exe executables, and some are jar executables run through batch files. When a user tries to download my apps, the browser says that the file is not commonly downloaded and may be harmful, and therefore blocks the download. If I zip the folders, it still does the same thing. Any format I choose still gets blocked. How can I stop Chrome from doing this? Is there a way I can verify my files so they will be considered trusted? Thanks in advance!

    Read the article

  • How to batch remove spamming users and pages they created on MediaWiki?

    - by Problemania
    I'm trying to clean up a MediaWiki instance which has been subjected to spamming and vandalism for a period of time. The current status is that there are a large number of users which have only created spam pages but typically have not altered legitimate pages. And there are fewer than 10 users which I know are legitimate and have created a small number of legitimate pages. Abstractly, my idea for fixing this messy situation is to find the complete list of users that are not in that small set of legitimate users, use the RenameUser extension to rename them all to a single Spammer user, and use the Nuke extension to mass-delete all pages that user created. Any practical advice on how to proceed? Since there are hundreds of spammer users, how do I rename them efficiently? It seems the Renameuser extension does not support automated batch renaming of users from a list or file.

    Read the article

  • Online job application

    - by Fred
    I am trying to add an application to my site where I can post job openings with my company and allow people to apply online. Can someone recommend a service or app already in existence for this purpose? I tried googling it, but could not find a set of search terms that did not return endless sites for job seekers. This is a (very) small business and I do not expect to have more than a few openings at any time; what I am actually interested in is having a repository of interested job seekers on file. Then when people ask me about openings, I could just refer them to the page and they could apply. Then, if we have an opening, we could look through the list of candidates, and if we can't fill the position(s) from that list, we could post the job and advertise to fill the position.

    Read the article

  • Most common Apache and PHP configuration for portable Web Applications

    - by Mahan
    I always build web applications in PHP, but I distribute and deploy my work to many different server platforms and web server configurations, so I always run into deployment problems because some features are enabled and others are disabled. My question: is there a standard web server configuration that is commonly used by most web servers worldwide, covering reliability, security and maintainability?

    Read the article

  • Paypal "Subscribe" button: Is it possible to let the subscriber set the amount?

    - by Šime Vidas
    I'm setting up a recurring payment option on my website. I'd like to have two options: Option 1 (for individuals): a fixed $6/mo subscription. Option 2 (for organizations): a subscription where the amount is set by the subscriber. PayPal's "Subscribe" button does not seem to allow that: when I leave the "Amount" field of the 2nd option empty, I get an error. So, is this not possible? Do all options require fixed amounts?

    Read the article

  • Bug? Flash of white when changing orientation on iOS Safari [migrated]

    - by Baumr
    What causes the flash of white to the right of a responsive design when changing orientation from portrait to landscape on iOS? Try it on iOS 6 Safari: websites like http://html5boilerplate.com don't do it, but http://www.initializr.com does. Something to do with re-processing (CPU lag) to fit a wider screen? It doesn't happen in Chrome for iOS 6... Update: I just removed all img elements from my test site, but it still happens. This seems to happen with a lot of different websites out there. Is it a bug in their code, or a Safari for iOS bug? Other sites are completely immune to it...

    Read the article

  • RewriteRule for URL Subdirectory Root

    - by JYerdon
    Have not found this in my searches on SE. I need this scenario to work: • If a user visits someurl.com/news/folder or someurl.com/news/somefolder/, they get redirected to someurl.com/somefolder. • If the user visits just someurl.com/news or /news/, they are allowed through to /news. Here is my current rule:

        RewriteRule ^news/(.*) /$1 [NC,R=301,L]

    How do I make it allow the second bullet point? The first seems to work with no issues. Thanks all! POST UPDATE: I now have:

        RewriteCond %{REQUEST_URI} ^news
        RewriteRule ^/news news/ [NC,L]
        RewriteCond %{REQUEST_URI} ^/news/(.*)$
        RewriteRule ^news/(.*) /$1 [NC,R=301,L]

    BUT - it doesn't allow me to go to the URL something.com/news/ Any thoughts?
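
    A minimal sketch that satisfies both bullets: requiring at least one character after /news/ means bare /news and /news/ never match and fall through untouched:

        RewriteEngine On
        # /news/anything -> /anything (301); /news and /news/ are left alone
        RewriteRule ^news/(.+)$ /$1 [NC,R=301,L]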

    Read the article

  • What factors help in getting a site indexed by Google fast?

    - by ekaj
    What should I do to get a site indexed by Google fast? Taking Super User as an example: I just did a quick Google search for a question that was 20 minutes old, looking for an answer, and it was already in Google's results - how is this possible? I glanced over this article, which seems to suggest it's because SU has RSS feeds (which SU has, but when I opened the feed the article said it was last posted 6 minutes ago, while on Google it shows as 11 hours old) - which leads me to think (based on that article; I don't know much about search indexing, but I am reading up at the moment) that most of this indexing speed is thanks to the sitemap. Is there anything else I am unaware of that helps SU questions get on Google so fast?

    Read the article

  • Handle php out of memory error

    - by PeterMmm
    I have a Drupal-based web site on a relatively small vserver (512 MB RAM). Recently the website began returning PHP out-of-memory messages like this: Fatal error: Out of memory (allocated 17039360) (tried to allocate 77824 bytes) in /home/... All php.ini memory limit parameters are set to off (-1). Probably the website has gained complexity, content, etc. But I cannot quite interpret that message: does it mean that the whole request has allocated 17 MB so far and cannot get about 76 KB more from the OS? Has the web server used up all the memory, or does the OS have no more memory to allocate? I'm not sure whether the memory overhead is coming from the web server or another service, because when I get the out-of-memory message I can't even get into the server with SSH. After a while everything runs fine again.
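
    One clue: with memory_limit at -1, the wording "Out of memory" (rather than "Allowed memory size ... exhausted") suggests the OS itself refused the allocation, i.e. the 512 MB vserver ran dry rather than PHP hitting its own limit. A minimal PHP sketch for confirming what PHP sees, to correlate spikes with the errors:

        <?php
        // log PHP's view of its memory use to the configured error_log
        error_log(sprintf(
            'memory_limit=%s current=%.1f MB peak=%.1f MB',
            ini_get('memory_limit'),
            memory_get_usage(true) / 1048576,
            memory_get_peak_usage(true) / 1048576
        ));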

    Read the article
