Search Results

Search found 20409 results on 817 pages for 'url routing'.

Page 458 of 817

  • GA tracking utm query params after hashbang

    - by hybrid9
    We currently use a hashbang for the portion of our site that generates dynamic content, which can also be deep linked. Our analytics team wants to use utm parameters to track referral traffic from social networks. We are using Universal Analytics (analytics.js) as well as GTM. Will GA pick up the query parameters after the hashbang, or do they always have to go before it? For example:
    1. example.com/#!/some/content?utm_source=foo&utm_campaign=bar
    2. example.com?utm_source=foo&utm_campaign=bar/#!/some/content
    In #1, I'm concerned that the utm params won't be recorded, and in #2 that the page will break or the URL could be written incorrectly. How does GA pull in those parameters - location.search? A regex? Can I get away with using either?

    Read the article

  • weird text stuttering

    - by user85724
    I've been having a strange problem with Ubuntu lately. I searched in the forums and so on but couldn't find anything. The thing is that text in Firefox and text editors stutters or freezes randomly. For example, when I am writing a URL and suddenly want to erase it, the last letters freeze in place for a few seconds while the others are being erased. Or when I am reading something in the text editor, it mixes paragraphs and such, like an old and crazy Epson inkjet printer. Please help?

    Read the article

  • View the Replay of Fusion Applications - Financials Highlights Lunchtime Webinar

    - by Theresa Hickman
    If you missed the free one-hour Webinar and training on Feb. 28 that I blogged about in a previous post, you can view the replay at your convenience. Fusion Applications – Financials Highlights URL: http://education.oracle.com/education/downloads/Fusion_Financials_LVCRecording/ If it asks you for a code, enter E1049. Those of you who pre-registered for this Webinar but could not attend due to the timezone difference should also have received an email from Oracle University with a link to the replay. Enjoy!

    Read the article

  • ERROR CHECKING !!

    - by moata_u
    I am trying to catch any error when running a command, in order to write a log file / report. I was trying to write this code:

        # FUNCTION FOR VALIDATION
        function valid (){
            if [ $? -eq 0 ]; then
                echo "$var1" ": status : OK"
            else
                echo "$var1" ": status : ERROR"
            fi
        }

        # COMMAND FUNCTION
        function save(){
            sed -i "/:@/c connection.url=jdbc:oracle:thin:@$ip:1521:$dataBase" $search
            var1="adding database ip"
            valid $var1
            sed -i "/connection.username/c connection.username=$name" #$search
            retval=$?
            var1="addning database SID"
            valid $var1 $retval
        }
        save

    OUTPUT:

        adding database ip : status : OK
        sed: no input file

    I want the output this way:

        adding database ip : status : OK
        sed: no input file : status : ERROR

    (or)

        adding database ip : status : OK
        addning database SID : status : ERROR

    I have tried so much, but it's not working for me :(((
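
    One way to get that kind of report - a minimal sketch only, assuming $ip, $dataBase, $name and $search are set elsewhere as in the snippet above, and with an assumed log file name - is to pass each command's exit status into the logging function explicitly instead of reading $? inside it:

        #!/bin/bash
        LOG=install_report.log          # assumed log file name

        # Print a status line for step $1 based on exit code $2, and append it to the log
        valid () {
            local step="$1" rc="$2"
            if [ "$rc" -eq 0 ]; then
                echo "$step : status : OK"    | tee -a "$LOG"
            else
                echo "$step : status : ERROR" | tee -a "$LOG"
            fi
        }

        save () {
            sed -i "/:@/c connection.url=jdbc:oracle:thin:@$ip:1521:$dataBase" "$search" 2>>"$LOG"
            valid "adding database ip" $?

            sed -i "/connection.username/c connection.username=$name" "$search" 2>>"$LOG"
            valid "adding database SID" $?
        }

        save

    Capturing $? on the very next line after each command is the key point: any statement in between, even a plain variable assignment, replaces it with its own exit status.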

    Read the article

  • Tool Review: Telerik JustDecompile

    - by Sam Abraham
    In the next few lines, I will be providing a brief review of Telerik’s JustDecompile, a free .Net decompiler and assembly browser. In using Telerik’s 2012 Q3 JustDecompile release, one can see many great features. First off, I loved the built-in options for loading .Net assemblies automatically using the Open->Load Framework menu option. Other options enable loading assemblies from the GAC, from a XAP URL, or locally from disk. The ability to create an “Assembly List” is quite handy for grouping and saving a “List” of DLLs to load. All loaded assemblies are shown in the left panel of a split-panel screen. Clicking an assembly expands all namespaces within. Drilling further to class level displays the actual source code in the right panel in either IL, C# or Visual Basic. In conclusion, JustDecompile has grown and quickly matured into an indispensable, handy tool for us developers. Telerik’s effort in maintaining and updating JustDecompile, as well as the company’s commitment to keeping it free, is much appreciated and valued.

    Read the article

  • Using Mod_Rewrite To Block Referrer Based On Domain Extension?

    - by Matt
    I've been in web development for several years now (I'm a student web designer), and recently I've begun to experiment with mod_rewrite for things like URL shortening. I was wondering, is it possible to block a referrer by domain extension, instead of just by a full site? So, instead of

        RewriteEngine on
        # Options +FollowSymlinks
        RewriteCond %{HTTP_REFERER} examplesite\.com [NC]
        RewriteRule .* - [F]

    could you do

        RewriteEngine on
        # Options +FollowSymlinks
        RewriteCond %{HTTP_REFERER} \.com [NC]
        RewriteRule .* - [F]

    without the full domain name? Thanks. I'm fairly knowledgeable about other web dev / hosting topics, but mod_rewrite is new to me and Google wasn't helping.
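
    For what it's worth, a rule like that can be exercised from the command line with curl, which lets you send an arbitrary Referer header; a quick sketch, assuming the rules are deployed on a hypothetical example.org:

        # Should be blocked (referrer contains ".com") - expect HTTP/1.1 403 Forbidden
        curl -I --referer "http://somesite.com/page" http://example.org/

        # Should pass (no referrer sent) - expect a normal 200 response
        curl -I http://example.org/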

    Read the article

  • Remote Desktop fails after VPN connection

    - by Samet Sorgut
    The local computer (comp 1) is connected to a remote computer (comp 2) with Remote Desktop. On the remote computer (comp 2), I try to establish a VPN connection to a different remote computer (comp 3). Once I try to establish the VPN connection from comp 2 to comp 3, Remote Desktop freezes on comp 1, and it is no longer possible to connect to comp 2 via Remote Desktop. What can be done to connect to this remote computer (comp 2) after it establishes a VPN connection? The only thing that comes to my mind is to install a second NIC and configure Remote Desktop to accept connections from that NIC while the VPN works over the other one. What do you suggest? EDIT: I want to use the internet connection of the VPN, so all traffic should go over the VPN, but with RDP still working. My IP: 100.0.0.1. The IP I connect to via RDP: 200.0.0.20 (mask: 255.255.255.192, gateway: 200.0.0.193). When 200.0.0.1 connects to the VPN, the IP of the VPN is 65.254.61.250. Will routing like this help (the command is issued on 200.0.0.20, the RDP host)?

        route ADD 65.254.61.250 MASK 255.255.255.192 200.0.0.193

    It couldn't be added; it gives the error "The route addition failed: The parameter is incorrect." I tried this before connecting to the VPN.
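
    A side note on that error, plus a hedged sketch of the usual workaround. Windows rejects a route whose destination has bits set outside the mask (65.254.61.250 masked with 255.255.255.192 gives 65.254.61.192, not .250), which is what produces "The parameter is incorrect." A common approach, assuming the addresses quoted above, is to pin a host route for the RDP client through the existing LAN gateway before dialing the VPN, so the RDP session survives the VPN taking over the default route:

        rem Run on 200.0.0.20 before connecting the VPN: keep traffic back to the
        rem RDP client (100.0.0.1) on the LAN gateway; -p makes the route persistent.
        route -p ADD 100.0.0.1 MASK 255.255.255.255 200.0.0.193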

    Read the article

  • SQL Server 2008 R2: StreamInsight changes at RTM: Event Flow Debugger and Management Interface Secur

    - by Greg Low
    In CTP3, I found setting up the StreamInsight Event Flow Debugger fairly easy. For RTM, a number of security changes were made. First config: to be able to connect to the management interface, your user must be added to the Performance Log Users group. After you make this change, you must log off and log back on, as the group membership is only added to your access token when you log on. I forgot this and spent ages trying to work out why I couldn't connect. Second config: you need to reserve the URL that the...(read more)
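
    For reference, HTTP URL reservations on Windows are made with netsh; a hypothetical example (the actual URL, port and account depend on how your StreamInsight host is configured) looks like this:

        rem Reserve an HTTP endpoint for a non-admin account (URL and account are placeholders)
        netsh http add urlacl url=http://+:10000/StreamInsight/ user=MYDOMAIN\StreamInsightUser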

    Read the article

  • WPF Alphabet (Available for download)

    - by mbcrump
    WPF Alphabet is an application that I created to help my child learn the alphabet. It displays each letter and pronounces it using speech synthesis. It was developed using WPF and C# in about 3 hours (so it's kinda rough). I went ahead and uploaded it to CodePlex for those in a similar situation or just wanting to see a particular WPF feature. I would also recommend Scott Hanselman’s BabySmash!. Specific WPF features: DispatcherTimer (WPF), SpeechSynthesizer (WPF), URL Navigate (WPF) not PAGE. XAML examples: DockPanel, Border, TextBlock, HyperLink, Buttons and events. Download the full source and binaries here.

    Read the article

  • Adding your website to free web directories as a link building strategy

    - by Man
    It's been two months since I launched a website. I recently ran into some websites which list directories of other websites. Some examples are http://www.addgoogleurl.com/ and webdirectorieslist.com, etc. I was talking with my colleague and he says that adding the URL of my website to these kinds of websites will negate the effect of other organic, real links. Does Google assign positive or negative value to these kinds of links from web directory websites? Do you have any source for your answer to refer to? I found this question asked before on webmasters.SE, but it asks about many links from a single website.

    Read the article

  • Redirect public traffic to a different subfolder, while local traffic remains unchanged

    - by ecnepsnai
    I would like to have local (intranet) HTTP traffic go to the /var/www/html folder while any public traffic goes to the subfolder /var/www/html/public. I've tried this configuration, with some variation, in httpd.conf:

        <VirtualHost PRIVATE-IP>
            DocumentRoot /var/www/html
            ServerName ecn
            ErrorLog /var/www/logs/error/private
            CustomLog /var/www/logs/access/private common
        </VirtualHost>

        <VirtualHost PUBLIC-IP>
            DocumentRoot /var/www/html/public
            ServerName PUBLIC-DOMAIN-NAME
            ErrorLog /var/www/logs/error/public
            CustomLog /var/www/logs/access/public common
        </VirtualHost>

    PUBLIC-IP, PRIVATE-IP, and PUBLIC-DOMAIN-NAME are all replaced with the correct values in the actual document. The problem is that local traffic can browse fine, but remote traffic is directed to the root folder and gets 403d (because I have that folder blocked off through my .htaccess file). If I append /public to the URL it works fine.
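
    As an aside, when an IP-based VirtualHost doesn't match the way you expect, one quick diagnostic is to ask Apache how it parsed the vhost configuration (a sketch; depending on the distribution the wrapper may be apachectl, apache2ctl or httpd):

        # Dumps the parsed VirtualHost list: which address/port each vhost binds to
        # and which vhost is the default for each address
        apachectl -S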

    Read the article

  • Grub can't find device on boot resulting in Grub Rescue

    - by user1160163
    So I have two hard drives: a 320GB HDD and a 20GB SSD. Before, I had Windows 7 on the HDD and Ubuntu on the SSD, but I wanted to get rid of Windows and reinstall a clean Ubuntu on the SSD, then use the HDD for storage. So I deleted everything from the HDD, set up the SSD with 18GB ext4 and 2GB swap, and installed Ubuntu on the 18GB ext4 partition. Though now when I boot up I get "Error: No such device. Grub Rescue." I have a live USB and I ran Boot Repair following these instructions - grub rescue after install of Ubuntu 12.04 (dual boot) - it says it was successful, though I still have the same problem. This is the URL given by Boot Repair - http://paste.ubuntu.com/1257988/ Thanks for any help given.

    Read the article

  • SEO and multiple domains to same site

    - by mwb
    I have one website and two domain names that I want to point to the same site install, so that whether you go to name-one.com or name-two.com you see the exact same site. Now, I can either set up name-two.com to serve a 301 redirect header redirecting to name-one.com, or I can set up name-two.com as a CNAME in the DNS pointing to name-one.com. What are the SEO implications of each? What is recommended? I would guess it's better for branding to use a 301 redirect, so that visitors will see one consistent URL for my site, right? The reason I want the two domains is that I want a version with regional letters ('ö' instead of 'oe') in the name.
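
    Worth noting: a CNAME only affects DNS, so with that approach both names would still answer with a 200 and the same content, whereas a 301 is visible in the HTTP response itself. A quick way to confirm the redirect is wired up, using the placeholder names above:

        # With the 301 in place, the response should carry a Location header
        # pointing at the canonical name, e.g.
        #   HTTP/1.1 301 Moved Permanently
        #   Location: http://name-one.com/
        curl -I http://name-two.com/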

    Read the article

  • AGPL License - does it apply in this scenario?

    - by user1645310
    There is an AGPLv3-based piece of software (Client) that makes web service calls (using SOAP) to another piece of software (Server - commercial, cloud based). There is no common code or any connection whatsoever between these two except for the web service calls being made. My questions: Does the Server need to be AGPL too? I guess not, but I would like to confirm. Let us say the endpoint URL for the Server can be configured on the Client side (by editing an XML file) to connect it to different Servers (again, there is no connection other than the web service calls being made); does that require any of these Servers to be AGPL? Are there any issues in running the Client as a DLL that is loaded by other commercial applications on users' desktops? Does that require these other applications also to be AGPL?

    Read the article

  • RewriteRule working locally but not on remote server

    - by m0tv
    I have a .htaccess file with one simple RewriteRule:

        RewriteEngine on
        RewriteRule ^([A-Za-z0-9-]+)$ ?site=$1

    I want to have a URL like http://www.example.com/imprint and forward it to http://www.example.com/?site=imprint. I checked this rule with a RewriteRule tester, which gave me the results I want to achieve, and on my local development system it works well too. But on the remote server the URLs just give me a 404 error. Other, simpler rewrite rules are working with no problems, so everything must be set up correctly (I think..). The problem is that I don't have access to any error logs or the server configs, so the only thing I can do is guess... Can anyone tell me if there's something wrong with this rule? Or anything else I can do or test to solve this? Or does someone have an idea what could be wrong on the server?

    Read the article

  • Why do XML namespace URIs use the http scheme?

    - by svick
    An XML namespace should be a URI, but it can use any URI scheme, including those that are not URLs. Then why do all widely used XML namespaces use the http scheme (e.g. http://www.w3.org/XML/1998/namespace), considering that trying to use the URI as a URL by retrieving that document over HTTP is not guaranteed to do anything useful (and often doesn't)? I understand the usefulness of DNS domains in namespace names to help guarantee uniqueness. But that doesn't require the http scheme; there could be a separate scheme for that (something like namespace:w3.org/XML/1998/namespace). This would avoid the confusion between URIs and URLs, while maintaining domain-based uniqueness.

    Read the article

  • jQuery .Flv Player

    - by H(at)Ni
    I've recently used a cool .flv player to play video files uploaded by the site admin, and thought it would be good to share. Download it from: http://flowplayer.org/ Using this control is very simple; just follow these steps:
    1. Create an anchor with href equal to the video URL:
        <a href="Video.flv" id="MyPlayer" ></a>
    2. Add the path to the player's JavaScript file:
        <script type="text/javascript" src="flowplayer-3.2.4.min.js"></script>
    3. Make a simple call to the flowplayer function with the anchor id and the path of the player's swf file:
        <script type="text/javascript">
          $(document).ready(function () {
              flowplayer("MyPlayer", "flowplayer-3.2.5.swf");
          });
        </script>

    Read the article

  • VirtualBox Port Forward

    - by john.graves(at)oracle.com
    A great new feature in VirtualBox 4.0 is the ability to use NAT networking and forward ports without needing to use ssh -L/-R tricks. This is great for booting multiple VM domains simultaneously: you can have several instances which map back to the host machine, with different ports on localhost automatically forwarding to the correct VM. This avoids the hassle of setting up DNS entries or static IP addresses. In this example, I'm mapping host ports 3xxxx to the VM's well-known server ports. Note: it is important to set up the Frontend HTTP host/port to avoid incorrect URL rewriting. You may also need to set up an HTTP channel to deal with local traffic, which uses the network address 10.0.2.15. Happy VMing.
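
    The forwarding rules themselves can be added from the command line with VBoxManage; a sketch of the kind of mapping described above (the VM name and port numbers are just placeholders):

        # Forward host port 30022 to guest port 22, and 37001 to 7001, on the first
        # NAT adapter: rule name, protocol, host IP, host port, guest IP, guest port.
        # The VM must be powered off when using modifyvm.
        VBoxManage modifyvm "MyServerVM" --natpf1 "ssh,tcp,127.0.0.1,30022,,22"
        VBoxManage modifyvm "MyServerVM" --natpf1 "admin,tcp,127.0.0.1,37001,,7001"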

    Read the article

  • wrong SEO result for staging website

    - by OCD
    Hi, we have a domain with a URL, let's say http://www.xxxxstaging.com/abc.php?id=10, which has a pretty good ranking for some keywords in Google. However, this is our staging website, and due to some historical reason it was exposed to the public. Now we have a production domain (the one we want to be publicly available), say http://www.xxxxproduction.com/abc.php?id=10, which has exactly the same HTML output as xxxxstaging.com. Our question is: for exactly the same keyword search, the production site never appears on any page of Google. Is it possible for us to "switch" the ranking in Google, either by telling the robot that they are indeed the same website, or by telling Google to change it for us? Thank you.

    Read the article

  • 301 redirects - can we not delete old pages?

    - by KBS
    First time here :) We have a page on the site which ranks well for an SEO term (top 5) but contains old information. We have added a new page, but Google doesn't rank it that well. Information on these pages is time sensitive.
    Old: example.com/2013-related-information.html
    New: example.com/2014-related-information.html
    The obvious solution is to delete the old page and do a 301 redirect to the new page. Now, can we still keep the old page by giving it a new URL?
    Step 1: example.com/2013-related-information.html is redirected to example.com/2014-related-information.html
    Step 2: the old 2013 content is recreated at a new address such as example.com/new-2013-related-information.html
    What we are trying to do is send the user to the fresh page but still not waste the record copy if someone wants to go and dig up the old page. Would appreciate help!! Cheers

    Read the article

  • How can I create a VLAN on my Extreme switch for a separate subnet/domain?

    - by drpcken
    I'm putting together a small Active Directory implementation for a buddy of mine. I currently have 2 servers (one is the primary domain controller) and a couple of clients. I need to test and run updates on every machine on this domain, but I would have to plug them into my current live domain to get internet access. From what I've read, having two separate domains on a single subnet is a bad idea (even though it is temporary), so I don't want to risk messing anything up on my production domain. I'm pretty sure I can create a separate VLAN on my Extreme 48-port switch and plug this smaller domain into it on a different subnet, but I don't know the commands. Both subnets would need internet access of course (one of the things I can't wrap my head around is routing internet traffic between subnets, since the gateway is on the production subnet). My production domain is on subnet 192.168.200.0. The new domain I want to put online would go onto subnet 192.168.10.0. A shove in the right direction would be greatly appreciated. Thank you!
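
    For what it's worth, on a switch running ExtremeXOS the VLAN side of this looks roughly like the sketch below - the VLAN name, tag, ports and addresses are made up, and older ExtremeWare firmware uses slightly different syntax:

        create vlan lab
        configure vlan lab tag 10
        configure vlan lab add ports 45-48 untagged
        configure vlan lab ipaddress 192.168.10.1 255.255.255.0
        enable ipforwarding vlan lab

    Routing between 192.168.10.0 and 192.168.200.0 then happens on the switch once ipforwarding is enabled on both VLANs; getting the lab subnet to the internet still needs either a return route for 192.168.10.0 on the production gateway or NAT somewhere in the path.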

    Read the article

  • Clean SOAP Calls from iOS - SudzC

    - by Richard Jones
    This is worth another mention. If you need to call SOAP web services from iOS or JavaScript - and let's face it, who doesn't - http://SudzC.com really delivers. You give it the URL to your WSDL file (or upload a file) and it just spits out a ready-to-go Xcode project. I would point out that to get it to work 100% I changed line 204 in Soap.m (the commented-out line is the old version, mine is below):

        //if([child respondsToSelector:@selector(name)] && [[child name] isEqual: name]) {
        if([child respondsToSelector:@selector(name)] && [[child name] hasSuffix: name]) {

    I consumed a Microsoft Dynamics NAV set of web service pages with no problem (and they tend to be fairly complex WSDL definitions).

    Read the article

  • help redirecting IP address

    - by Alice
    Google has indexed the IP address of my site rather than the domain, so now I'm trying to set up a 301 redirect that will redirect the IP address and all subsequent pages to the domain. I currently have something like this in my .htaccess file (however, I don't think it's working correctly):

        RewriteCond %{HTTP_HOST} ^12.34.567.890
        RewriteRule (.*) (domain address)/$1 [R=301,L]

    I've used various redirect checker tools and keep getting the message: "... not redirecting to any URL or the redirect is NOT SEARCH ENGINE FRIENDLY". Am I doing something wrong, or is there something else I should be trying? Thanks! Alice

    Read the article

  • SEO Benefits of adding a Tumblr feed to site

    - by Paul
    A client of ours has a CMS-driven blog in his hotel site. He would like to use the blog to add depth to his site and gain SEO benefits relating to the blog's content. The current blog is a basic header / text field and doesn't contain any tagging / meta features. Unfortunately we don't have a .NET developer in our team to alter the existing blog and add meta / tagging, and there isn't budget to hire one - so I considered using a Tumblr blog: setting it up externally, giving it a blog.hotelname.com address, and feeding it into the existing page via Tumblr's JS, which basically does a document.write into the page, which we can style. I understand from a previous post (Poor CMS blog vs Tumblr embed) that as a general rule most search engines ignore JS-created content - but will the above approach act as an improvement on the existing system for now, as the blog will be set up externally with its own URL and also feed into the existing site? Cheers, Paul

    Read the article

  • Chromium and Google Chrome downloading

    - by user286166
    Well, I have sincerely enjoyed having Google Chrome on my Linux machine, and have never had any problems until recently. I found that Google Chrome will simply not allow redirected downloads to start: I can download from direct links, but not from pages that redirect and say "Start in 3 seconds...". I immediately assumed it was the webpage itself, so I refreshed many times. I then restarted the browser and, after another failed attempt, my computer. At that point, I suspected my internet provider was to blame, so I tried the redirection link in an alternative browser (Midori), and it worked perfectly fine. I then decided it must be the version of Chrome that Google put out, so I quickly installed Chromium, and to my dismay ran across the same problem. I can live with copying and pasting the URL into Midori for redirected links, but I'd like the convenience of staying in my main browser. Thank you in advance for any advice. c:

    Read the article
