Search Results

Search found 6198 results on 248 pages for 'traffic filtering'.


  • Possible automated Bing Ads fraud?

    - by Gary Joynes
    I run a website that generates life insurance leads. The site is very simple: a) there is a form for capturing the user's details, life insurance requirements, etc.; b) there is a quote comparison feature. We drive traffic to our site using conventional Google AdWords and Bing Ads campaigns. Since the 6th of January we have received 30-40 dodgy leads which have the following in common:
    - All created between 2 and 8 AM
    - Phone number always in the format "123 1234 1234"
    - Name, date of birth, policy details and address all seem valid and are unique across the leads
    - Email addresses from "disposable" email accounts, including dodgit.com, mailinator.com, trashymail.com and pookmail.com
    - Some leads come from the customer form, some via the quote comparison feature
    - All come from different IP addresses
    - We get the keyword information passed through from the URLs
    - All look to be coming from Bing Ads
    - All come from Internet Explorer v7 and v8
    The consistency of the data and the random IP addresses seem to suggest an automated approach, but I'm not sure of the intent. We can handle identifying these leads within our database, but is there any way of stopping this at the ad level, i.e. before the click-through?

    Read the article

  • URL Rewrite http to https EXCEPT files in a specific subfolder

    - by BrettRobi
    I am trying to force all traffic on my web site to use HTTPS, using the URL Rewrite 2.0 module added to IIS 7.5. I got that working, and now have a need to exclude a couple of pages from using SSL. So I need a rule to redirect all URLs to HTTPS except those referencing this folder. I've been banging my head against the wall on this and am hoping someone can help. I tried creating a rule to match all URLs except those in a nossl subfolder, as in this example:

        <rule name="HTTP to HTTPS redirect" enabled="true" stopProcessing="true">
            <match url="(/nossl/.*)" negate="true" />
            <conditions logicalGrouping="MatchAll" trackAllCaptures="false">
                <add input="{HTTPS}" pattern="off" />
            </conditions>
            <action type="Redirect" url="https://{HTTP_HOST}/{R:1}" redirectType="Found" />
        </rule>

    But this doesn't work. Can anyone help?
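    Two details of URL Rewrite may explain the failure (a hedged sketch, not a verified fix): the url pattern in match is tested against the path without a leading slash, so "(/nossl/.*)" never matches anything; and when negate="true" there is no {R:1} capture available to the action, so the target has to be rebuilt from a server variable such as {REQUEST_URI}. A corrected rule along those lines:

        <rule name="HTTP to HTTPS redirect" enabled="true" stopProcessing="true">
            <!-- The path is matched without a leading slash -->
            <match url="^nossl/" negate="true" />
            <conditions logicalGrouping="MatchAll" trackAllCaptures="false">
                <add input="{HTTPS}" pattern="off" />
            </conditions>
            <!-- No {R:1} exists for a negated match, so rebuild from REQUEST_URI -->
            <action type="Redirect" url="https://{HTTP_HOST}{REQUEST_URI}" redirectType="Found" />
        </rule>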

    Read the article

  • Edit 100MB+ file

    - by Majid Fouladpour
    I have captured some traffic with Wireshark and saved the result as a file. The file has three sections now: request headers, response headers, and the response body. The response body is to become an FLV file, but at the moment everything is saved as a single file. So I need a way to delete the first two sections from the file, but the problem is that the file is very big (over a thousand megabytes). I have tried to open it with gedit, but no matter how long I wait, gedit hangs and remains unresponsive until I kill it. What tool can I use to edit this big file easily?
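    One way to do this without loading the file into an editor at all (a sketch, assuming the response body begins at the standard FLV signature, the bytes 'FLV'; the file names are placeholders):

        # Find the byte offset of the first FLV signature in the capture,
        # then copy from there to the end of the file.
        # grep: -a treat binary as text, -b print byte offset, -o print match only
        offset=$(grep -abo 'FLV' capture.raw | head -n1 | cut -d: -f1)
        # tail -c +N starts output at byte N (1-based), hence the +1
        tail -c +$((offset + 1)) capture.raw > video.flv

    Both grep and tail stream the data, so the gigabyte size is not a problem.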

    Read the article

  • LiveMeeting VC PowerShell PASS – Troubleshooting SQL Server with PowerShell

    - by Laerte Junior
    Guys, join me on Wednesday, July 18th at 12 noon EDT (GMT -4) for a presentation called Troubleshooting SQL Server With PowerShell. It will be in English, so please make allowances for this. I'm sure you're aware that my English is not perfect, but it is not so bad; I will do my best, you can be sure. The registration link will be available soon from PowerShell.sqlpass.org, so I hope to see you there. It will be a session without slides. Just code; pure PowerShell code. Trust me, we will see a lot of COOL stuff. Big thanks to Aaron Nelson (@sqlvariant) for the opportunity! Here are some more details about the presentation, "Troubleshooting SQL Server with PowerShell – The Next Level": It is normal for us to have to face poorly performing queries or even complete failure in our SQL Server environments. This can happen for a variety of reasons, including poor database designs, hardware failure, improperly configured systems and OS updates applied without testing. As database administrators, we need to take precautions to minimize the impact of these problems when they occur, and so we need the tools and methodology required to identify and solve issues quickly. In this session we will use PowerShell to explore some common troubleshooting techniques used in our day-to-day work as a DBA, including gathering performance counters on several servers at the same time using background jobs, identifying blocked sessions, and reading and filtering the SQL error log even if the instance is offline. The approach will use some advanced PowerShell techniques that allow us to scale the code for multiple servers and run the data collection in asynchronous mode.
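    As a taste of the counters-with-background-jobs idea (my own sketch, not code from the session; the server names are placeholders):

        # Collect a couple of performance counters from several servers at once,
        # one background job per server.
        $servers  = 'SQL01', 'SQL02', 'SQL03'
        $counters = '\Processor(_Total)\% Processor Time',
                    '\Memory\Available MBytes'

        $jobs = foreach ($server in $servers) {
            Start-Job -Name $server -ScriptBlock {
                param($computer, $counters)
                Get-Counter -ComputerName $computer -Counter $counters `
                            -SampleInterval 5 -MaxSamples 12
            } -ArgumentList $server, $counters
        }

        # Block until every job finishes, then pull the samples back.
        $jobs | Wait-Job | Receive-Job |
            Select-Object -ExpandProperty CounterSamples |
            Select-Object Path, CookedValue, Timestamp |
            Format-Table -AutoSize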

    Read the article

  • Generating AdSense ad impressions?

    - by Danny
    I own two websites. Let's say, as an example, www.first.com and www.second.com. I have two apps in the Google Chrome store whose app URLs are: app1 URL = www.first.com/currency_converter.php app2 URL = www.first.com/currency_converter.php respectively. Currently, if a user launches/clicks on the first app URL, I forward them to my first website's homepage (www.first.com), where I have AdSense code along with the currency converter app. In this way I am generating Google ad impressions and traffic for my first website (i.e. www.first.com). So is this a valid approach? Am I violating any Google AdSense rule? Please provide your reply or suggestions. P.S.: Please comment if my question is not clear; I will try to write it in some other way.

    Read the article

  • Over-reporting in Google Analytics - social media stats

    - by colmcq
    I have a client whose traffic from social media reads thus:
    - 80% from Facebook
    - 1% from Twitter
    This suggests they are not exploiting Twitter at all, and this was in my presentation, but my boss took it out, claiming Twitter stats are under-reported in Google Analytics. I can't substantiate this claim and wonder where she got this idea from. Can anyone shed light on this? Are my stats wrong, and should I disregard these figures? 80 to 1 seems like one hell of an under-report! Thanks, c

    Read the article

  • How to distribute a unique database already in production?

    - by JVerstry
    Let's assume a successful Spring web application running on a MySQL or PostgreSQL kind of database. The traffic is becoming so high and the amount of data is becoming so big that a distributed database solution needs to be implemented. It is a scalability issue. Let's assume this application is using Hibernate and the data access layer is cleanly separated with DAO objects. What would be the best strategy to scale this database? Does anyone have hands-on experience to share? Is it possible to minimize sharding code (Shard) in the application? Ideally, one should be able to add or remove databases easily. A failback solution is welcome too. I am not looking for "you could go for sharding" or "you could go NoSQL" kind of answers. I am looking for deeper answers from people with experience.

    Read the article

  • Collision Detection fails with AI cars

    - by amit.r007
    I am making a car parking game in Flash and AS3, wherein I drive my car along with other AI traffic cars moving along a specified path using guidelines. I am using CDK for collision detection. The collision detection works fine with a few AI cars, but doesn't seem to work as required for a few others. When an AI car is moving on a path in a straight line it works fine, but when the AI car turns at 90 degrees, my car goes into the AI car (overlapping), hits at the center of that AI car, and only then is the collision detected. I made a new path and used a new Sprite for the AI car, but the problem persists.

    Read the article

  • Hosting advice for a write-heavy dynamic website

    - by Rahul Rawat
    I have built a website using PHP and MySQL and now I am looking for a hosting service. I am expecting about 1,000 users registering and about 5-10k pageviews/day within a week's time. So which host should I opt for? The site lets users submit content as blobs, and around 10 pictures per user. I hope that traffic will increase, so can JustHost's or BlueHost's shared hosting serve that purpose, or should I go for a more dedicated one? Basically the site is write-heavy, there are on average 2-3 MySQL queries per page, and it is quite dynamic. Depending on these requirements, which web hosting will be optimal for me?

    Read the article

  • How can I disable recent documents in Unity?

    - by detly
    How do I disable the tracking and display of recently opened files (and whatever else is remembered) in a default installation of Ubuntu 11.10? (Note that this is not a duplicate of How can I keep recent files from appearing in Unity?, since that question and its answers are concerned with temporary and specific filtering. I want to disable it completely for a single user account.) Okay, to deflect the inevitable and expand on my motivation... While trawling the usual forums and Google results for a solution, it (unsurprisingly) seems that the near-universal use cases for this request are either browsing porn or Warhammer research. And the obvious solution to this is to create another user account to contain all evidence. However, this is not why I'm asking, and I don't say that to get all high and mighty about it, it's because this answer won't help. (Even though I really don't have any interest in Warhammer, and I have no idea how that paint pot and brush ended up in my drawer, no that's not glue on my thumb, etc.) My actual use case is that I use my personal laptop for presentations in different circles of my life. I have a user account set up with all the settings I like for presentations (shortcuts, small launcher, default associations, etc). But I don't want an accidental keystroke (or the find dialog) to display other recent presentations I've given, or the files I used in composing the presentation, or whatever. I also don't want to have to recreate this profile for every single presentation I might give. I just want a nice little isolated, memoryless, clean corner of my notebook for public display.
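    For what it's worth, Unity's recent-files list in 11.10 is fed by the Zeitgeist activity log, so one per-account approach (a sketch; the package name and paths are my assumption from a stock install) is to stop the logger and clear its database under that user:

        # Run as the presentation user only; other accounts keep their history.
        zeitgeist-daemon --quit                      # stop the logging daemon
        rm -rf ~/.local/share/zeitgeist              # wipe the existing activity log
        sudo apt-get install activity-log-manager    # GUI to keep logging switched off

    Activity Log Manager then lets you disable logging entirely (or per application) for that account.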

    Read the article

  • SQL Rally Nordic & Amsterdam slides & demos

    - by Davide Mauri
    Last week I had the pleasure of speaking at two GREAT conferences (as you can see from the wordclouds I've posted, here for Stockholm and here for Amsterdam; I used two different filtering techniques to produce the wordclouds, which is why they look different. I'm playing a lot with R these days, so I like to experiment with different stuff). The workshop with my friend Thomas Kejser on "Data Warehouse Modeling – Making the Right Choices" and my sessions on "Automating DWH Patterns through Metadata" were really appreciated by attendees, given the amount of good feedback I had on Twitter and in some blog posts (here and here). Of course many asked for the slides & demos to download, so here you are!
    Automating DWH Patterns through Metadata – Stockholm: http://sdrv.ms/1bcRAaW
    Automating DWH Patterns through Metadata – Amsterdam: http://sdrv.ms/1cNDAex
    I'm still trying to work out if and how I can publicly post the slides & demos of the workshop, so for those you will have to wait a couple of days. I will post about it as soon as possible. Anyway, if you were in the workshop and would like to get the slides & demos ASAP, just send me an email and I'll happily send the protected link to my SkyDrive folder to you. Enjoy!

    Read the article

  • No Obvious Answer - Query-Strings and Javascript

    - by nchaud
    Say I have this main page /my-site/all-my-bath-soaps which lists all my products. It has a search filter text box that uses JavaScript to filter the products the user wants to see on that page (the URL doesn't change as they filter). Now, from many other parts of the site, I want to navigate to this products page and see specific products. E.g. <a href="/my-site/all-my-bath-soaps?filter='Nivea-Soap'"> will go to /all-my-bath-soaps and apply JavaScript filtering to show just that product and hide the DOM nodes for all the others. The problem is that if the user changes the text in the filter from 'Nivea-Soap' to 'Lynx', the JavaScript will work fine and show the new products, but the URL stays at ?filter='Nivea-Soap'. Is there anything I can do about this? Of course, I don't want to reload the page with a new query string every time they change the search criteria. It would be great to move the ?filter=... criteria into POST data instead, but I don't know how to do that with a link...
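    One option, assuming reasonably modern browsers (a sketch; the input id and the filtering routine are placeholders for whatever the page already has): rewrite the query string in place with history.replaceState, falling back to the URL fragment.

        // Keep ?filter= in the address bar in step with the text box,
        // without triggering a page load.
        var filterBox = document.getElementById('filter');

        filterBox.onkeyup = function () {
            var value = filterBox.value;
            applyFilter(value); // the page's existing DOM-filtering routine
            if (window.history && window.history.replaceState) {
                window.history.replaceState(null, '',
                    '?filter=' + encodeURIComponent(value));
            } else {
                // Older browsers: the fragment can carry the criteria instead.
                window.location.hash = 'filter=' + encodeURIComponent(value);
            }
        };

    This also keeps the filtered view bookmarkable, which POST data would not.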

    Read the article

  • Can using an apt proxy (d-i mirror/http/proxy string http://mymirror) affect the installation of a .deb?

    - by Randolph
    I have been doing Ubuntu deployment using a preseed.cfg. After becoming comfortable with the packages being installed, it was time to reduce download time and internet traffic by creating a mirror. I ended up doing a "partial mirror" using apt-cacher-ng, and preseeding it by adding d-i mirror/http/proxy string http://mymirror to the preseed.cfg. This is where things got strange. I have a few .debs that I install as part of preseed/late_command by wgetting them and installing them with dpkg -i. The packages were installing without issue until I added the proxy; with the proxy, they fail to install. So does the proxy affect installing .debs during a preseeded installation?
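    It might: the installer exports the mirror proxy into its environment, so the late_command wget is likely going through apt-cacher-ng too, and apt-cacher-ng only serves repository-style requests. A quick way to test (a sketch; the package URL is a placeholder) is to blank the proxy just for those fetches:

        # In preseed.cfg: fetch the .deb with the proxy cleared, then install it.
        d-i preseed/late_command string \
            http_proxy= wget http://mymirror/pkgs/custom.deb -O /target/tmp/custom.deb ; \
            in-target dpkg -i /tmp/custom.deb

    If that works, the proxy was intercepting the download rather than dpkg failing.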

    Read the article

  • A few tips on deploying Secure Enterprise Search with PeopleSoft

    - by Matthew Haavisto
    Oracle's Secure Enterprise Search is part of PeopleSoft now. It is provided as part of the PeopleTools platform as an appliance, and is used with applications starting with release 9.2. Secure Enterprise Search is a rich and powerful search product that can enhance search and navigation in PeopleSoft applications. It also provides useful features like facets and filtering that are common in consumer search engines. Several questions have arisen about the deployment of SES, how to administer it, and how to ensure optimum performance. People have also asked about which versions are supported on various platforms. To address the most common of these questions, we are posting this list of tips.
    Platform Support: SES 11.1.2.2 does not support some of the platforms supported by PeopleTools, such as Windows 2012 and AIX 7.1. However, PeopleSoft and SES can use different operating system platforms when SES is deployed on a separate machine. SES 11.2.2.2 will have the required platform support for PT 8.53 in the future. We are planning to certify PT 8.53 once the testing is complete in 8.54 development and all platform support is released for 11.2.2.2.
    Architecture: We recommend running SES on a separate machine (from your apps) for two reasons:
    1. SES bundles specific WebLogic, Java, and Oracle DB versions and might need different OS patches, at a minimum, than PeopleSoft. By having SES run on a different machine, these prerequisites can be managed better through their lifecycles, independently for PeopleSoft and SES.
    2. SES is resource intensive - it runs its own WebLogic and Oracle database. By having SES run on its own machine, sufficient resources can be allocated to SES, freeing the PeopleSoft servers from the impact of SES load patterns.
    We will be providing a comprehensive red paper covering PeopleSoft/SES administration in the near future, but until that is published, we'll post tips on this blog.

    Read the article

  • How to store prices that have effective dates?

    - by lal00
    I have a list of products, each of which is offered by N providers. Each provider quotes us a price for a specific date; that price is effective until the provider decides to set a new price, in which case they give the new price with a new date. The MySQL table header currently looks like:
        provider_id, product_id, price, date_price_effective
    Every other day, we compile a list of products/prices that are effective for the current day. For each product, the list contains a sorted list of the providers that have that particular product. That way, we can order certain products from whoever happens to offer the best price. To get the effective prices, I have a SQL statement that returns all rows that have date_price_effective >= NOW(). That result set is processed with a Ruby script that does the sorting and filtering necessary to obtain a file that looks like this:
        product_id_1,provider_1,provider_3,provider_8,provider_10...
        product_id_2,provider_3,provider_2,provider_1,provider_10...
    This works fine for our purposes, but I still have an itch that a SQL table is probably not the best way to store this kind of information. I have a feeling that this kind of problem has been solved previously in other, more creative ways. Is there a better way to store this information other than in SQL? Or, if using SQL, is there a better approach than the one I'm using?
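    Incidentally, the per-day filtering can also be done entirely in SQL (a sketch against the table described, with the table name assumed; note the comparison direction: a price is effective from its date forward, so the current price is the newest one not in the future):

        -- Latest effective price per (provider, product), sorted so the
        -- cheapest provider comes first for each product.
        SELECT p.provider_id, p.product_id, p.price, p.date_price_effective
        FROM   prices p
        JOIN  (SELECT provider_id, product_id,
                      MAX(date_price_effective) AS max_date
               FROM   prices
               WHERE  date_price_effective <= NOW()
               GROUP  BY provider_id, product_id) latest
          ON  latest.provider_id = p.provider_id
         AND  latest.product_id  = p.product_id
         AND  latest.max_date    = p.date_price_effective
        ORDER BY p.product_id, p.price;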

    Read the article

  • How long should I keep 301 redirecting pages from a deprecated domain?

    - by ElHaix
    I had an old domain that I have deprecated, with all requests 301 redirected to my new site. The new site is now receiving a decent amount of traffic, but I don't know how much of it is 301 redirected from the old site, and doing a site:[old site] search still shows several thousand pages indexed. Since all pages from the old site are 301 redirected, will they ever be removed from the index as long as the old domain name is active? As a rule of thumb, I've somewhere picked up 90 days for any significant site changes. When is it safe to burn the old domain?

    Read the article

  • Are long methods always bad?

    - by wobbily_col
    Looking around earlier, I noticed some comments about long methods being bad practice. I am not sure I always agree that long methods are bad (and would like opinions from others). For example, I have some Django views that do a bit of processing of the objects before sending them to the view, a long method being 350 lines of code. I have my code written so that it deals with the parameters - sorting/filtering the queryset - then bit by bit does some processing on the objects my query has returned. The processing is mainly conditional aggregation, with rules complex enough that it can't easily be done in the database, so I have some variables declared outside the main loop that get altered during the loop:

        variable_1 = 0
        variable_2 = 0
        for obj in queryset:
            if obj.condition_a and variable_2 > 0:
                variable_1 += 1
            # ... more conditions that alter the variables
        return queryset, context

    So according to the theory I should factor out all the code into smaller methods, so that the view method is at most one page long. However, having worked on various code bases in the past, I sometimes find this makes the code less readable, when you need to constantly jump from one method to the next figuring out all the parts of it, while keeping the outermost method in your head. I find that with a long method that is well formatted, you can see the logic more easily, as it isn't hidden away in inner methods. I could factor the code out into smaller methods, but often there is an inner loop being used for two or three things, so it would result in more complex code, or methods that don't do one thing but two or three (alternatively I could repeat the inner loops for each task, but then there would be a performance hit). So is there a case that long methods are not always bad? Is there always a case for factoring out methods when they will only be used in one place?

    Read the article

  • Why is Google Analytics displaying wrong landing pages?

    - by Salman
    I see all of my pages as landing pages in Google Analytics, which cannot be true, as I did not post those pages anywhere and I don't see any traffic hitting those pages directly. Also, I am using virtual page views on a few buttons, and I see those virtual pages as landing pages too. For example:
        /click/request-a-quote    35000 views
    35000 is too big a number to be ignored. Even if I ignore the virtual page views, I see a lot of pages as landing pages that I am 100% sure visitors (at least not so many users) are NOT hitting directly. Any advice on how to debug it? P.S.: I'm using the following code:
        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', '<']);
        _gaq.push(['_setDomainName', 'none']);
        _gaq.push(['setLocalGifPath', '/images/_utm.gif']);
        _gaq.push(['_setAllowLinker', true]);
        _gaq.push(['_trackPageview','account/phase1']);

    Read the article

  • Laptop not syncing with Ubuntu One

    - by Lolwhites
    I have a desktop running Maverick and a laptop running Lucid, both supposedly linked to my Ubuntu One account. The desktop syncs fine, but when I started the laptop for the first time in a couple of months, it wouldn't sync any more. The Ubuntu One preferences window either reports "Synchronisation complete", even though no recent files have been downloaded and no test files created in the relevant synced folder have been uploaded, or it says "Synchronisation in progress", which does not appear to be happening, as it stays like this for ages and the lights on my router suggest no traffic is going through. I have repeatedly tried disconnecting and reconnecting, and removing the device from the account then reattaching it, all to no avail.

    Read the article

  • How to structure my AdWords campaign for testing and different groups of keywords?

    - by Romain Dorange
    I am starting an AdWords campaign and will measure conversion rates using the AdWords conversion tracking pixel. A conversion might be an account creation or a concrete sale. As it will be a test campaign, run to gather insights on CTR, CR, etc. for the future, I am likely to try several configurations:
    - two different ads with different landing URLs and messages: one with a focus on the product, the other containing a discount embedded in the URL;
    - 4 different groups or themes of keywords.
    I guess I have to:
    - build 4 ad groups based on the keywords;
    - write 2 ads with the different messages;
    - assign the two ads to each ad group;
    - follow the campaign closely in the Ads tab, where I can see the effectiveness of each ad per ad group (for a total of 8 lines of reporting).
    Also, what are the key performance indicators I can get from an AdWords campaign to measure global effectiveness?
    - measure of return on investment from concrete sales (tracking pixel with e-commerce tag on the confirmation page)
    - measure of return on investment from lead acquisition (tracking pixel on account creation)
    - measure of traffic increase with the campaign
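    For reference, the conversion tracking pixel is a small snippet pasted on the confirmation (or account-created) page; a sketch in the classic ga.js-era format, where the ID and label below stand in for the ones AdWords generates per conversion action:

        <!-- AdWords conversion tag (legacy format; values are placeholders) -->
        <script type="text/javascript">
            var google_conversion_id = 1234567890;
            var google_conversion_label = "AbC-D_efG-h12_34-567";
            var google_conversion_value = 1.0;  // e.g. sale amount for the e-commerce tag
        </script>
        <script type="text/javascript"
                src="//www.googleadservices.com/pagead/conversion.js">
        </script>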

    Read the article

  • Ensuring ethernet is configured before continuing init scripts.

    - by Pete Ashdown
    Is there a better way to ensure that an ethernet port is configured before continuing through startup init scripts? When 802.3ad bonded ethernet is configured on Ubuntu, it takes some time before it finishes protocol negotiation and starts passing packets, because the networking script configures the bond but does not verify that traffic is being passed. As a result, this can throw off some of the other network-dependent scripts, like the init for drbd. Right now, I just have a loop that pings the gateway in a startup script, but this seems less than optimal:

        GATEWAYIP=10.0.0.1
        while ( ! ping -c 1 $GATEWAYIP ); do
            echo gateway not up
        done
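    For an 802.3ad bond specifically, one alternative (a sketch; it assumes the stock bonding driver's /proc interface and a bond named bond0) is to wait for the kernel to report the aggregate link as up instead of pinging through it:

        # Block until the bonding driver reports the bond's link as up.
        BOND=bond0
        until grep -q 'MII Status: up' "/proc/net/bonding/$BOND" 2>/dev/null; do
            sleep 1
        done

    This avoids depending on a gateway being pingable, though a timeout around the loop would still be prudent.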

    Read the article

  • Do search engines rank internal redirects negatively?

    - by siverd
    A client is in the late stages (code complete) of a website redesign and unfortunately hasn't implemented 301 redirects to point high-traffic pages to the new URLs. As I understand it, our only option at this point is to create redirects within the CMS. Our CMS allows us to do this: www.mysite.com/category/current-page.html will redirect to www.mysite.com/new-category-name/new-page.html. The site now uses custom logic on our 404 page to check this list of redirects and, if one exists, forwards the user to the new-page.html. I understand that using 301 redirects would be the correct way to maintain our page rank, but I think that would require a code change, which isn't possible.
    Question: How will search engines respond to this? Will they wait until the redirect happens and allow us to keep our page rank (authority, trust, etc.), or will they see the 404 page and down-rank us? Worst case: will they make our new-page.html start from a rank of "0"? Thanks for your help.
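    If the 404 handler is ordinary server-side code, it can often be taught to answer with a genuine 301 before rendering anything, which sidesteps the problem (a PHP-flavoured sketch; the lookup function stands in for the CMS's existing redirect-list check):

        <?php
        // In the custom 404 handler: if the requested path is in the
        // redirect list, answer with a real 301 instead of a 404.
        $target = lookup_redirect($_SERVER['REQUEST_URI']); // hypothetical CMS helper

        if ($target !== null) {
            header('HTTP/1.1 301 Moved Permanently');
            header('Location: http://www.mysite.com' . $target);
            exit;
        }

        header('HTTP/1.1 404 Not Found'); // no mapping: render the normal 404 page

    Crawlers then see the status code they expect, and the ranking signals should transfer as with any 301.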

    Read the article

  • Question about server usage, big community platform

    - by Json
    I'm working on a community platform written in PHP and MySQL, and I have some questions about server usage; maybe someone can help me out. The community is based on jQuery, with many AJAX requests to update content: it makes 5-10 AJAX (JSON, GET, POST) requests every 5 seconds, which fetch user data like notifications and messages via MySQL queries. I wonder how a server will handle this when there are more than 5,000 users online. That would be up to 50,000 requests every 5 seconds (10,000 per second), and with 15,000 users online, 150,000 requests every 5 seconds. My web server has the following specs:
    - Quad-core Xeon
    - 2048 MB RAM
    - 5000 GB traffic
    Will it be good enough, and for how many users? Can anyone help me out, or point me to where I can find such information and how to make this kind of calculation?

    Read the article

  • Migrate from Thunderbird to Mutt

    - by deshmukh
    I am contemplating a move from Thunderbird to Mutt (provided it is feasible) in search of a faster, simpler application. My current Thunderbird set-up consists of multiple IMAP accounts (Gmail and Google Apps). Only selected folders (read: labels) in each IMAP account are stored locally; for all other folders, I glance through the headers and open a message only if I find it interesting. I also use folder bookmarks to navigate to folders quickly, and move messages across folders with keyboard shortcuts. Is it possible to replicate this set-up in Mutt? Can someone share, or point to, a sample muttrc file that does the same thing? It would be great if the muttrc file were adequately commented. On a side note, will it also be possible to import my messages from Thunderbird locally? That would save me considerable network traffic (about 2 GB of data is stored locally).
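    A minimal muttrc sketch for one Gmail-style IMAP account (the address and folder names are placeholders; repeat per account or use account-hook): header caching keeps remote browsing fast, mailboxes limits what is polled, and the macros give Thunderbird-style folder bookmarks and one-key moves.

        # --- Account (placeholder values) ---------------------------------
        set imap_user = "me@example.com"
        set folder    = "imaps://imap.gmail.com/"
        set spoolfile = "+INBOX"
        set record    = "+[Gmail]/Sent Mail"
        set postponed = "+[Gmail]/Drafts"

        # Poll only these folders; everything else is browsed on demand.
        mailboxes +INBOX "+Work" "+Receipts"

        # Cache headers and bodies locally so remote folders open quickly.
        set header_cache     = ~/.mutt/cache/headers
        set message_cachedir = ~/.mutt/cache/bodies

        # Folder bookmarks: jump with two keystrokes.
        macro index gi "<change-folder>+INBOX<enter>" "go to inbox"
        macro index gw "<change-folder>+Work<enter>"  "go to Work"

        # Move the current/tagged message to a folder with one key.
        macro index mw "<save-message>+Work<enter>"   "move to Work"

    For the local import: Thunderbird stores mail in mbox files, which Mutt reads natively, so pointing Mutt at the Thunderbird profile directory (or copying the mbox files out) is usually enough.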

    Read the article

  • All About Search Engine Position Optimization

    Search engine optimization, or SEO, is a method of increasing the amount of traffic or hits to your website, with the result that your website ranks high in search engine results. These results are produced whenever an individual types a keyword or a set of keywords into a search query in search engines like Yahoo!, Google and the like. Being high on the list of search results matters a lot because it makes you more visible to the general public, especially to your target market. This differentiates you from your competitors, who may rank low in the search results or may not appear in the results lists at all.

    Read the article
