Search Results

Search found 9757 results on 391 pages for 'shekhar pro'.


  • Removing Duplicate Data From SQL Query Output For Display On A Web Page [migrated]

    - by doubleJ
    I asked a similar question on Stack Overflow but didn't really get anywhere. This page shows the output I'm currently getting from my MSSQL server. I have a table of venue information (name, address, etc.) for the venues our events happen at. Separately, I have a table of the actual events that are scheduled (an event may happen multiple times in one day and/or over multiple days). I join those tables with this query:

        <?php
        try {
            $dbh = new PDO("sqlsrv:Server=localhost;Database=Sermons", "", "");
            $dbh->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
            $sql = "SELECT TOP (100) PERCENT
                        dbo.TblSermon.Day, dbo.TblSermon.Date, dbo.TblSermon.Time,
                        dbo.TblSermon.Speaker, dbo.TblSermon.Series, dbo.TblSermon.Sarasota,
                        dbo.TblSermon.NonFlc, dbo.TblJoinSermonLocation.MeetingName,
                        dbo.TblLocation.Location, dbo.TblLocation.Pastors,
                        dbo.TblLocation.Address, dbo.TblLocation.City, dbo.TblLocation.State,
                        dbo.TblLocation.Zip, dbo.TblLocation.Country, dbo.TblLocation.Phone,
                        dbo.TblLocation.Email, dbo.TblLocation.WebAddress
                    FROM dbo.TblLocation
                    RIGHT OUTER JOIN dbo.TblJoinSermonLocation
                        ON dbo.TblLocation.ID = dbo.TblJoinSermonLocation.Location
                    RIGHT OUTER JOIN dbo.TblSermon
                        ON dbo.TblJoinSermonLocation.Sermon = dbo.TblSermon.ID
                    WHERE (dbo.TblSermon.Date >= { fn NOW() })
                    ORDER BY dbo.TblSermon.Date, dbo.TblSermon.Time";
            $stmt = $dbh->prepare($sql);
            $stmt->execute();
            $stmt->setFetchMode(PDO::FETCH_ASSOC);
            foreach ($stmt as $row) {
                echo "<pre>";
                print_r($row);
                echo "</pre>";
            }
            unset($row);
            $dbh = null;
        } catch (PDOException $e) {
            echo $e->getMessage();
        }
        ?>

    As it loops through the query results, it creates an array for each record and ends up like this:

        Array ( [Day] => Tuesday [Date] => 2012-10-30 00:00:00.000 [Time] => 07:00 PM [Speaker] => Keith Moore [Location] => The Ark Church [Pastors] => Alan & Joy Clayton [Address] => 450 Humble Tank Rd. [City] => Conroe [State] => TX [Zip] => 77305.0 [Phone] => (936) 756-1988 [Email] => [email protected] [WebAddress] => http://www.thearkchurch.org )
        Array ( [Day] => Wednesday [Date] => 2012-10-31 00:00:00.000 [Time] => 07:00 PM [Speaker] => Keith Moore [Location] => The Ark Church [Pastors] => Alan & Joy Clayton [Address] => 450 Humble Tank Rd. [City] => Conroe [State] => TX [Zip] => 77305.0 [Phone] => (936) 756-1988 [Email] => [email protected] [WebAddress] => http://www.thearkchurch.org )
        Array ( [Day] => Tuesday [Date] => 2012-11-06 00:00:00.000 [Time] => 07:00 PM [Speaker] => Keith Moore [Location] => Fellowship Of Faith Christian Center [Pastors] => Michael & Joan Kalstrup [Address] => 18999 Hwy. 59 [City] => Oakland [State] => IA [Zip] => 51560.0 [Phone] => (712) 482-3455 [Email] => [email protected] [WebAddress] => http://www.fellowshipoffaith.cc )
        Array ( [Day] => Wednesday [Date] => 2012-11-14 00:00:00.000 [Time] => 07:00 PM [Speaker] => Keith Moore [Location] => Faith Family Church [Pastors] => Michael & Barbara Cameneti [Address] => 8200 Freedom Ave NW [City] => Canton [State] => OH [Zip] => 44720.0 [Phone] => (330) 492-0925 [Email] => [WebAddress] => http://www.myfaithfamily.com )

    As you can see, The Ark Church and its associated contact information are duplicated, so when I work with those arrays and output them to the page, I see a bunch of duplicate content. I'd like to remove the duplicate information so that I get results similar to this:

        The Ark Church
        Alan & Joy Clayton
        450 Humble Tank Rd.
        Conroe, TX 77305
        (936) 756-1988
        [email protected]
        http://www.thearkchurch.org
        Meetings:
        Tuesday, 2012-10-30 07:00 PM
        Wednesday, 2012-10-31 07:00 PM

        Fellowship Of Faith Christian Center
        Michael & Joan Kalstrup
        18999 Hwy. 59
        Oakland, IA 51560
        (712) 482-3455
        [email protected]
        http://www.fellowshipoffaith.cc
        Meetings:
        Tuesday, 2012-11-06 07:00 PM

        Faith Family Church
        Michael & Barbara Cameneti
        8200 Freedom Ave NW
        Canton, OH 44720
        (330) 492-0925
        http://www.myfaithfamily.com
        Meetings:
        Wednesday, 2012-11-14 07:00 PM

    It doesn't necessarily have to end up like that (I'm not looking for code specific to these results, but for a concept of how to avoid showing the duplicated information). I'm assuming that an additional foreach or while will do it, but I haven't figured out any logic along the lines of if ($location == $previouslocation) { skip the venue details }.
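    A minimal sketch (not from the question) of one way to apply that idea: rather than comparing each row against the previous one, group the fetched rows by venue first, then print each venue once followed by all of its meeting times. The field names are taken from the arrays above; everything else is illustrative.

        <?php
        // Group rows by venue, collecting the meeting times for each one.
        $venues = array();
        foreach ($stmt as $row) {
            $key = $row['Location'];   // assumes Location uniquely identifies a venue
            if (!isset($venues[$key])) {
                $venues[$key] = array('info' => $row, 'meetings' => array());
            }
            $venues[$key]['meetings'][] = $row['Day'] . ', ' . $row['Date'] . ' ' . $row['Time'];
        }

        // Print each venue block once, then its meetings.
        foreach ($venues as $venue) {
            $info = $venue['info'];
            echo $info['Location'] . "<br>" . $info['Pastors'] . "<br>" . $info['Address'] . "<br>";
            echo $info['City'] . ", " . $info['State'] . " " . $info['Zip'] . "<br>";
            echo "Meetings:<br>";
            foreach ($venue['meetings'] as $meeting) {
                echo $meeting . "<br>";
            }
        }
        ?>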

    Read the article

  • SEO - Index images (lazyload)

    - by Guilherme Nascimento
    Note: my question is not about JavaScript. I'm developing a plugin for jQuery/MooTools/Prototype that works with the DOM. The plugin is meant to improve page performance (a better user experience) and will be distributed to other developers so they can use it in their projects. How lazy loading works: images are only loaded when you scroll down the page (it will look like this LazyLoad demo: http://www.appelsiini.net/projects/lazyload/enabled_timeout.html). But it shouldn't require HTML5; I'm referring to this attribute: data-src="image.jpg". Two good examples of websites using lazy loading are youtube.com (suggested videos) and facebook.com (photo galleries). I believe the best alternative would be to use

        <a href="image.jpg">Content for ALT=""</a>

    and convert it with JavaScript into

        <img alt="Content for ALT=&quot;&quot;" src="image.jpg">

    You may ask: why do I want to do this at all? Because HTML5 is not supported by every browser (especially mobile browsers), and the data-src="image.jpg" attribute is not read by indexers at all. I need the HTML to be fully accessible to search engines; otherwise the plugin will not be useful to other developers. I thought about helping indexing with

        <noscript><img src="teste.jpg"></noscript>

    but noscript is said to have a negative effect on indexing (I'm referring to the content of noscript). I want a plugin that will not obstruct image indexing in search engines; it will be used by other developers (and by me too). So this is my question: how can I make images accessible to search engines while minimizing requests?
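    A minimal sketch (not from the question) of the anchor-to-image conversion described above, assuming jQuery and a hypothetical marker class "lazy" on the links; the scroll-triggered part is omitted for brevity:

        // Replace <a class="lazy" href="image.jpg">alt text</a> with an <img>,
        // so crawlers see a plain link while scripted browsers get the image.
        $(function () {
            $('a.lazy').each(function () {
                var $link = $(this);
                var $img = $('<img>', {
                    src: $link.attr('href'), // the image URL crawlers already see as a link
                    alt: $link.text()        // the link text doubles as the alt attribute
                });
                $link.replaceWith($img);
            });
        });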

    Read the article

  • Submitting a sitemap to take care of inherited Google crawler errors

    - by leeand00
    I have an awful lot of Google crawler errors (1,000 or so) after inheriting a site whose previous owner migrated it without moving much of the content. Would generating a sitemap of the current site and submitting it to Google help fix this? Is there any quicker, automated way to eliminate the errors other than clicking through each and every one? Note: I have already tried automating this on my own.
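    For reference, a minimal sitemap in the standard sitemaps.org XML format (the URL and date below are placeholders):

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <url>
            <loc>http://www.example.com/</loc>
            <lastmod>2012-11-01</lastmod>
          </url>
        </urlset>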

    Read the article

  • Migrating from a wordpress.com to wordpress.org blog without harming SEO

    - by kikio
    I've had a wordpress.com weblog for 3 years. Its pages have good PageRank and are shown on the first pages of search results. Because of wordpress.com's limitations, I have to migrate to my own WordPress install. How do I migrate safely, with minimal SEO problems? (I know how to export content from wordpress.com and import it into a new wordpress.org blog.) Note 1: the link structure and site design are different on the new WordPress blog. (I don't like wordpress.com's link structure :| ) Note 2: as you know, it's not possible to edit the .htaccess file on wordpress.com, so I can't use 301 redirects.

    Read the article

  • As web development, are the legal issues the same when making a sex dating site?

    - by YumYumYum
    I have created many other normal sites that are not related to any dating or sexual content. Do the same rules and regulations apply for a developer when building a sex-related dating site, where people meet, get to know each other, and pursue a sexual relationship (you know what I mean), including a webcam feature, but not explicitly a porn site? Do those sites have any special legal terms and conditions for the developers, compared with the terms and conditions of non-sexual, non-dating sites?

    Read the article

  • Best practices when loading images for improving page loading speed

    - by Naoise Golden
    I am working on optimizing a page's loading speed. Here are some analytics: notice how the images, although accounting for only 65% of the total size (1.1 MB), are by far the slowest-loading assets: 96% of the time. I'd like to know the recommended practices for optimizing loading speed, taking only images into account. Some of the techniques we are already applying (see the Apache sketch below for the last one):

        - image compression
        - images hosted on a cookieless domain and a CDN
        - spriting everything that can be sprited
        - HTTP headers: keep-alive, and Expires set to one year

    Disclaimer: I have gone through the available documentation; I think that by focusing on image-loading optimization I am not creating a duplicate or a subjective question.
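    As an aside (not from the question), the far-future Expires headers mentioned in the list are typically set with something like this on Apache, assuming mod_expires is enabled:

        # Far-future Expires headers for images (requires mod_expires)
        ExpiresActive On
        ExpiresByType image/png  "access plus 1 year"
        ExpiresByType image/jpeg "access plus 1 year"
        ExpiresByType image/gif  "access plus 1 year"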

    Read the article

  • Send and Receive SMS from my Website [on hold]

    - by Weng Fai Wong
    The way I plan to use it is:

        1. Have people send an SMS to a number to vote.
        2. On the backend (assuming the data comes back to my web server), display the voting results on my website.
        3. After, say, 10 minutes, press a button on my website so that ONE of the people who sent an SMS earlier receives an SMS saying they are a winner.

    I'm an ASP.NET developer, so I just need an API to code against. One company I saw that does this (but is limited to the US) is http://www.twilio.com/sms/. Do you know any international providers similar to Twilio SMS? I'm based in Sydney, Australia. I've looked through the discussion here but could not find an international provider that does what Twilio SMS does: How to add SMS text messaging functionality to my website? Thank you.

    Read the article

  • different data with same title and keywords

    - by Junaid Saeed
    Here is my scenario: I have a website where I redirect users based on the device they are using. Let's say a user is visiting from an iPad; I take him directly to the page of iPad wallpapers. The user selects his iPad version, and I take him to the gallery of wallpapers, where he can select and download any wallpaper, each at the required resolution. I have my reasons for doing this. Now, the thing is, different resolution versions of an image appear in 5 different sections of my website, each having its own view page. There is only one record in the database table for the image, and based on my consistent naming convention for the images, I pick the required one. This means that when the 5 different pages are generated in the 5 categorized sections of the website, then, due to the shared DB record, the keywords, the titles, and every single detail of the 5 pages are the same, besides the resolution of the image and the section-specific details. The pages do have different paths, though:

        wallpapers.com/ipad-1/cars/Ferrari-dino.html
        wallpapers.com/ipad-2/cars/Ferrari-dino.html
        wallpapers.com/ipad-3/cars/Ferrari-dino.html
        wallpapers.com/ipad-4/cars/Ferrari-dino.html
        wallpapers.com/ipad-5/cars/Ferrari-dino.html

    That is my scenario. How do search engines see it, and how do they rank it? Is it a good, normal, or bad SEO practice? If bad, how dangerous is it for my site's SEO? I need your comments on my scenario.

    Read the article

  • rich snippets ignored by google [closed]

    - by Thoir Fáidh
    Possible Duplicate: Why would Google Rich Snippets work for one site author but not another? I'm facing a problem here. I added rich snippets (microdata) to the website, but Google ignores all of them, and the testing tool doesn't detect any errors either. I've read that Google ignores microdata in hidden fields. Unfortunately this is partially the case, since I use jQuery to interact with the content; nevertheless, the markup is not hidden everywhere, and I believe Google should recognize at least the microdata that is permanently visible to the user. Am I missing something here? It is now about 3 weeks since I updated the website with rich snippets.

    Read the article

  • Duplicating someone's content legitimately & writing HTML to support that

    - by Codecraft
    I want to add content from other blogs to my own (with the authors' permission) to help build additional relevant content and to support articles I've found useful that others have written. I'm looking into how to do this responsibly, i.e. by giving the original content author a boost and not competing against them for search traffic that should go to their site. To keep my duplicate content out of search, and to hint to the search engines where the original content is to be found, I've implemented:

        <head>
            <meta name='robots' content='noindex, follow'>
            <link rel='canonical' href='http://www.originalblog.com/original-post.html' />
        </head>

    Additionally, to boost the original article and to let readers know where it came from, I'll be adding something like this:

        <div>
            Article originally written by
            <a href='http://www.authorswebsite.com'>Authors Name</a>
            and reproduced with permission.<br/>
            <a href='http://www.originalblog.com/original-post.html' target='new'>
                Read the original article here.
            </a>
        </div>

    All that remains is a way to 'officially' credit the original author in the HTML for the search spiders to see. Can anyone tell me a way to do this, possibly using rel="author" (as far as I can see, that's only good for my own original content)? Or perhaps it doesn't matter, given that the reproduced pages will be kept out of search engines? Also, have I overlooked anything in the approach?

    Read the article

  • URL blocked in robots.txt but still showing up on Google search [closed]

    - by Ahmad Alfy
    Possible Duplicate: Why do Google search results include pages disallowed in robots.txt? In my robots.txt I am disallowing a lot of URLs. Google Webmaster Tools says there are 750+ URLs blocked. The problem is that the URLs are still showing up in Google search. For example, I have the following rule:

        Disallow: /entity/child-health/

    But when I search for some-keyword + child health, the following URL shows up:

        http://www.sitename.com/entity/child-health/

    Am I doing anything wrong? Is it possible for a URL to be blocked using robots.txt and still show up in search results?
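    Worth noting (not from the question): robots.txt blocks crawling, not indexing, so a disallowed URL can still appear in results if other sites link to it. The usual fix is to allow crawling and serve a noindex signal instead, for example via a response header. A sketch, assuming Apache with mod_headers and a .htaccess placed under /entity/child-health/ (hypothetical placement):

        # Ask crawlers not to index anything served from this directory.
        # This only works if robots.txt ALLOWS crawling, so the header can be fetched.
        Header set X-Robots-Tag "noindex"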

    Read the article

  • Does sitewide html refactoring affect Google traffic?

    - by Name
    Good morning. I recently made a big structural change on my site, and the very next day the number of Google impressions went from 75,000 to 3,000, with a proportional drop in traffic from searches. No URLs were changed, and neither were the page titles or descriptions. Everything is exactly the same, just different-looking, except that the site barely appears on Google anymore. Does anybody have a clue why?

    Read the article

  • 301 url rewrite loop

    - by anyvendetta
    I need a 301 rewrite to force all URLs to become lowercase. I put this in .htaccess (with RewriteMap lc int:tolower in httpd.conf):

        RewriteCond %{REQUEST_URI} [A-Z]
        RewriteRule . ${lc:{REQUEST_URI}} [R=301,L]

    Everything works just fine except for URLs with subcategories, which in this case look like /category-1256-Product-page-example.html (the number 1256 refers to a "subcategory"). When I try to access /category-1256-Product-page-example.html, I get a redirect-loop error. I think other redirect rules are causing the loop, but I don't know how to fix it, because it's just these rewrite rules that don't work with the rewrite above:

        RewriteRule ^main-site-url/category-([0-9]*)-([-_a-zA-Z0-9]*)\.html$ /subcategories.php?idcategory_main=1&idcategory=$1&category=$2 [L]
        RewriteRule ^main-site-url/([0-9]*)-([-_a-zA-Z0-9]*)-([0-9]*)\.html$ /file.php?idcategory_main=1&idsubcategory=$1&product=$2&idproduct=$3 [L]
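    Not from the question, but a commonly used guard for this kind of loop: apply the lowercase redirect only to the original client request, not to URLs that mod_rewrite has already rewritten internally. A sketch, assuming the same RewriteMap as above:

        # REDIRECT_STATUS is empty on the original request and set once an
        # internal rewrite has happened, so this rule can no longer loop.
        RewriteCond %{ENV:REDIRECT_STATUS} ^$
        RewriteCond %{REQUEST_URI} [A-Z]
        RewriteRule . ${lc:{REQUEST_URI}} [R=301,L]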

    Read the article

  • Looking for a FormBuilder that gives me all images and sourcecode to my form

    - by Tracy Anne
    Wow, I started my search this morning and didn't think it would be so difficult to find. I'm just tired of spending hours putting together simple HTML forms in Dreamweaver. I'm an enthusiast web developer, mostly focused on PHP and MySQL. I hate CSS and HTML, and I'm looking for a simple program that will put a form together for me, which I can then completely embed into my site. I'll do all of the programming to attach it to my database; I just need the form and images. I've looked into JotForm, Wufoo, 123forms, etc., but it seems like they all want to keep my form on their servers in one way or another. It looked like JotForm had a developer's version, but $450 is a little steep for a part-timer like me. Is there no simple software out there that will throw a nice stylized form together for me?

    Read the article

  • Google Webmaster Tools Index Status is 0 but sitemap URL shows indexed

    - by DD.
    I've added my site, www.medexpress.co.uk, to Google Webmaster Tools. The site was submitted a few weeks ago. The index status shows 0, but the sitemaps section shows that 6 URLs have been indexed. If I search on Google I can see that the site is indexed and several pages appear: https://www.google.co.uk/search?q=site%3Awww.medexpress.co.uk&oq=site%3Awww.medexpress.co.uk&sourceid=chrome&ie=UTF-8 My question is: why is the index status 0 when the sitemaps section shows several indexed pages and the pages also appear in the search engine?

    Read the article

  • Multiple domains and product categories but one company?

    - by Brad
    Hi guys, my company is expanding online, and we are wondering about the best way to go about our e-commerce strategy. We sell a wide range of products of a single material; let's use ceramics as an example. The current competition in our niche online is medium-level. We currently have one site selling our whole range: ceramicstuff.com. However, I have just found that ceramickitchenware.com, ceramicbowls.com, etc. are currently unregistered, despite quite decent monthly search volume around those keywords. What do you think about registering these domains to increase traffic? Would I put standalone sites on those domains, or do I point them at my main domain? Or do I use them as "micro" sites to offer information, which then link to my main domain to buy, etc.? Summary: I'm looking to employ "spammy"-type SEO tricks (multiple domains, etc.), but the key point is that I will be generating REAL content and offering a REAL, QUALITY product. How should I proceed? Thanks!

    Read the article

  • Attach a Wordpress.org blog to my BigCommerce Store as a sub-domain

    - by user1323814
    I am stuck in a peculiar situation. I have a store on BigCommerce, configured with a domain from GoDaddy (mystore.com). I recently created a custom WordPress blog and hosted it on 1and1 hosting (s418783372.onlinehome.us), since BigCommerce can't host WordPress. Now I want to serve it from a subdomain of my main BigCommerce store (models.mystore.com), but it doesn't seem to be working, since BigCommerce is the domain manager but GoDaddy is the domain registrar and 1and1 is the host, so 1and1 doesn't control the domain. I tried setting up a CNAME record on BigCommerce, and when it didn't work I asked BigCommerce about it, but they said they can't do anything since they aren't the domain registrar, and gave me this message: "The responsibility to show the name in the browser on the site is up to the server or site admin. The CNAME can only get the browser there." UPDATE: I succeeded in setting up a CNAME on BigCommerce pointing to the site at 1and1, but for some reason all it gives me is a 404 Not Found error. I was thinking this is due to a restriction at 1and1; any idea how to overcome that?

        Not Found
        The requested URL / was not found on this server.
        Additionally, a 404 Not Found error was encountered while trying to use an ErrorDocument to handle the request.

    I tried adding a domain in the 1and1 control panel (http://faq.1and1.co.uk/domains/domain_xfers/dns_transfer/4.html) pointing to models.mystore.com, but it wasn't letting me add a subdomain there. UPDATE: I added mystore.com as an external domain and then added models.mystore.com as a subdomain in the 1and1 hosting domains panel. And it works :) Thank you all.

    Read the article

  • Force SSL and WWW in .htaccess

    - by Stephen
    I'm looking for a way to force SSL and WWW. I've been able to force both separately, but together I keep running into redirection issues. The following code works when handling a URL in the format "http://domain.com" and properly redirects to "https://www.domain.com", but when the incoming URL is "https://domain.com" it will not forward to "https://www.domain.com". Any suggestions? EDIT: it should also send "http://www.domain.com" to "https://www.domain.com".

        RewriteCond %{REMOTE_ADDR} !127\.0\.0\.0
        RewriteCond %{SERVER_PORT} 80
        RewriteCond %{HTTP_HOST} !^www.domain\.com$
        RewriteRule ^(.*)$ https://www.domain.com/$1 [R,L]
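    A sketch (not from the question) of one common way to express "force HTTPS and www together": redirect whenever the request is either not HTTPS or not on the www host. Assumes mod_rewrite and that SSL terminates on this server:

        RewriteEngine On
        # Redirect if the connection is not HTTPS, OR the host is not www.domain.com
        RewriteCond %{HTTPS} !=on [OR]
        RewriteCond %{HTTP_HOST} !^www\.domain\.com$ [NC]
        RewriteRule ^(.*)$ https://www.domain.com/$1 [R=301,L]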

    Read the article

  • DNS Configuration when registrar and host are two different companies

    - by dclowd9901
    I'm a total noob when it comes to DNS configuration. My client bought a domain through one company and is hosting their site with another (on a virtual dedicated server). I can't find anything on the web that walks one through this setup. Where do I start? Which nameservers do I use? Which company's zone files do I edit? Basically, it boils down to this: I don't understand which company takes the lead, and which piggybacks off the other's configuration. Thanks in advance for the help.

    Read the article

  • URL good practice for category sub category?

    - by Seting
    I have developed an application and I need to make its URLs SEO-friendly. I have the following URL structure:

        http://localhost:3000/posts/product/testing-with-slug-url-2
        http://localhost:3000/posts/product/testing-with-slug-url-2-4-23

    Is this good practice? If not, how should I rewrite it? To explain: my application is a shopping site. If a customer searches for mobiles, he is redirected to a URL like

        http://mydomain.com/cat/mobile-3

    The 3 in the URL is my database id, used for further searching. After the user reaches the mobile page, he may filter by brand, e.g. Nokia, so the URL looks like

        http://mydomain.com/subcat/nokia-3-2

    The integers at the end refer to category id 3 and brand id 2. My doubt is whether the integers at the end of the URL will affect SEO ranking.
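    As an illustration (not from the question) of why many sites keep those trailing ids: they make routing trivial. A minimal PHP sketch of splitting trailing numeric ids off a slug like nokia-3-2 (the function name is hypothetical):

        <?php
        // Peel numeric segments off the end of a slug; the rest is the text part.
        function parseSlug($slug) {
            $parts = explode('-', $slug);
            $ids = array();
            while (!empty($parts) && ctype_digit(end($parts))) {
                array_unshift($ids, (int) array_pop($parts));
            }
            return array('slug' => implode('-', $parts), 'ids' => $ids);
        }

        print_r(parseSlug('nokia-3-2'));
        // Array ( [slug] => nokia [ids] => Array ( [0] => 3 [1] => 2 ) )
        ?>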

    Read the article

  • Cannot access Adsense funds after switching to third-party partnership program

    - by Clay
    I had a Google Adsense partnership with $80 in it, but then switched to a different partnership and now can't get my money. When I first started YouTube, I joined the Adsense partnership program. After gaining $80 in my Adsense account, I got an offer to join a third party partnership program called Zoomin.tv. I accepted, and it is paying me monthly now. The problem is that my Adsense account still has the $80 in it, and is not gaining more cash. The Zoomin.tv money is going directly to my PayPal. The payment threshold in Adsense is $100, and you can't make it lower. Therefore, my money is stuck in Adsense and I'd love a solution that allows me to access my money.

    Read the article

  • Canonicalization of single, small pages like reviews or product categories [SEO]

    - by Valorized
    In general I pretty much like the idea of canonicalization, and in most cases Google explains the possible procedures clearly. For example: if I have duplicates because of parameters (e.g. &sort=desc), it's clear to point the canonical at the main URL, via a link element within the head tag. However, I'm wondering how to handle small (not to say thin-content) sites. What do I mean by a small site? An example: on one of my main sites, we use a directory-based URL structure:

        example.com/ (root)
        example.com/category-abc/
        example.com/category-abc/produkt-xy/

    Moreover, we provide one page that includes all products (it lists them the same way as the categories do):

        example.com/all-categories/

    For reviews, we use a similar structure:

        example.com/reviews/product-xy/ (all reviews for one product)
        example.com/reviews/product-xy/abc-your-product-is-great/ (one review)
        example.com/reviews/ (all reviews for all products, latest first)

    To make it even more complicated: every product page shows its latest two reviews at the end. So you see, there is a lot of potential for duplicates.

    Q1: Should I create canonicals for
        a: example.com/category-abc/ pointing to example.com/all-categories/
        b: example.com/reviews/product-xy/abc-your-product-is-great/ pointing to example.com/reviews/product-xy/ or to example.com/reviews/
    or for none of them?

    Q2: Can I link the collection of categories (all-categories/) and the collections of reviews (reviews/ and reviews/product-xy/) to the single category and the single review, respectively? Example: example.com/reviews/ includes, let's say, 100 reviews. Can I somehow use markup that tells search engines: "Hey, wait, you are now looking at a collection of 100 reviews. Do not index this collection; you should rather prefer indexing every single review as a single page!"? In HTML it might look something like this (which, of course, does not work; it's only to show what I mean):

        <div class="review" rel="canonical" href="http://example.com/reviews/product-xz/abc-your-product-is-great/">HERE GOES THE REVIEW</div>

    Reason: I don't think it is a great user experience if the user searches for "your product is great" and lands on example.com/reviews/ instead of example.com/reviews/product-xy/abc-your-product-is-great/. On the first page he will have to search around and might give up out of frustration; the second result, however, might lead to a conversion. The same applies to categories: if the user is searching for category Z, he might land on the all-categories page and have to scroll down to the (last) category to find what he searched for (Z). So what's best practice? What should I do? Thank you for your help!

    Read the article

  • What is the best currency API out there?

    - by YouBook
    I may be asking an odd question, but Webmasters seems like a decent place to post this. I'm searching for an accurate, easy currency API (using JSON or XML) that I can use in my web application. I've been trying Google's unofficial API, but its reliability isn't great because of string-parsing issues (a weird JSON format from which PHP sometimes returns incorrect data, or the string gets truncated due to a parsing error). Google's API is fair, but it could be improved. So all I'm asking for is the best currency API out there; it should be documented so I can use it with PHP. Cheers.
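    Not an endorsement of any particular provider — just a minimal PHP sketch of consuming a documented JSON rate endpoint defensively, which sidesteps the truncation problem described above. The URL and response fields are hypothetical placeholders:

        <?php
        // Hypothetical endpoint returning e.g. {"base":"USD","rates":{"EUR":0.78}}
        $raw = file_get_contents('https://api.example.com/latest?base=USD');
        if ($raw === false) {
            die('Request failed');
        }

        $data = json_decode($raw, true);
        // json_decode() returns null on malformed or truncated JSON
        if ($data === null || !isset($data['rates']['EUR'])) {
            die('Malformed or incomplete response');
        }

        echo 'USD -> EUR: ' . $data['rates']['EUR'];
        ?>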

    Read the article

  • case-specific mod rewrite on Wordpress subdomain multisite

    - by Steve
    I have split a WordPress blog into multiple category-specific blogs using subdomains, as the topics in the original blog were too broad to be lumped together effectively. Posts were exported from the parent www blog and imported into the subject-specific subdomain blogs. I believe .htaccess provides mod_rewrite for all subdomains (including the original www) in a single .htaccess file. I use .htaccess to perform a 301 redirect on post categories to the relevant post on the subdomain's blog, e.g.:

        RedirectMatch 301 ^/auto/(.*)$ http://auto.example.com/$1

    The problem I have is that the category has been retained in the permalink structure of the subdomain blog, so that www.example.com/auto/mercedes is now auto.example.com/auto/mercedes. The first URL is redirected to the second, but unfortunately the second URL is then redirected to auto.example.com/mercedes by the same rewrite rule, which is not found, as the permalink on the subdomain's blog retains the parent category of auto. The solution would be to adjust the permalink structure in the subdomain's WP settings, so that the top-level category does not duplicate the subdomain. My question would be: how do I then strip a section of the original (www) blog's post URL from the subdomain's URL when redirecting? E.g., how do I redirect www.example.com/auto/mercedes to auto.example.com/mercedes? I'm assuming this would be a regular-expression trick, which I am not great at. Update: I might have to use

        RewriteCond %{HTTP_HOST} !auto.example.com$

    in the default WordPress if-loop in .htaccess, and separate my custom subdomain redirections into a second if-loop section.
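    Not from the question — a minimal mod_rewrite sketch of the host-conditional redirect being asked for, assuming the single shared .htaccess described above:

        # Only redirect /auto/... when the request arrived on the www host,
        # so the rule cannot fire again once the visitor is on the subdomain.
        RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
        RewriteRule ^auto/(.*)$ http://auto.example.com/$1 [R=301,L]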

    Read the article

  • How to reload google dfp in ajax content? [migrated]

    - by cj333
    Google DFP supports ads in AJAX-loaded content, but if I put all the code in the main page, it always shows the same ads, even after the AJAX content is reloaded for the next page. I read http://productforums.google.com/forum/#!msg/dfp/7MxNjJk46DQ/4SAhMkh2RU4J, but my code does not work. Any working code suggestions? Thanks. Main page code:

        <script type='text/javascript'>
        $(document).ready(function(){
            $('#next').live('click', function(){
                var num = $(this).html();
                $.ajax({
                    url: "album-slider.php",
                    dataType: "html",
                    type: 'POST',
                    data: 'photo=' + num,
                    success: function(data){
                        $("#slider").center();
                        googletag.cmd.push(function() {
                            googletag.defineSlot('/1*******/ads-728-90', [728, 90], 'div-gpt-ad-1**********-' + num).addService(googletag.pubads());
                            googletag.pubads().enableSingleRequest();
                            googletag.enableServices();
                        });
                    }
                });
            });
        });
        </script>

    album-slider.php:

        <!-- ads-728-90 -->
        <div id='div-gpt-ad-1**********-<?php echo $_GET['photo']; ?>' style='width:728px; height:90px;'>
            <script type='text/javascript'>
            googletag.cmd.push(function() {
                googletag.display('div-gpt-ad-1**********-<?php echo $_GET['photo']; ?>');
            });
            </script>
        </div>
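    Not from the thread, but the GPT library has a refresh call commonly used for exactly this: define the slot once at page load, keep a reference to it, and refresh it on each AJAX page change instead of redefining it. A sketch; the div id here is hypothetical:

        <script type='text/javascript'>
        var sliderSlot;
        googletag.cmd.push(function () {
            sliderSlot = googletag.defineSlot('/1*******/ads-728-90', [728, 90], 'div-gpt-ad-slider')
                                  .addService(googletag.pubads());
            googletag.enableServices();
            googletag.display('div-gpt-ad-slider');
        });

        // Call this from the $.ajax success handler instead of defining a new slot:
        function reloadSliderAd() {
            googletag.cmd.push(function () {
                googletag.pubads().refresh([sliderSlot]);
            });
        }
        </script>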

    Read the article
