Search Results

Search found 44517 results on 1781 pages for 'google desktop search'.


  • Recent Solaris Studio how-to articles

    - by unixman
    There were a few Oracle Solaris Studio articles published recently, check 'em out!
    - How to Develop Code from a Remote Desktop with Oracle Solaris Studio: describes the remote desktop feature of the Oracle Solaris Studio IDE, and how to use it to compile, run, debug, and profile your code running on remote servers.
    - How to Use Remote Development in the IDE: describes the modes of remote development available in the Oracle Solaris Studio 12.3 IDE and how to choose the best one for your development environment.
    - Performance Tips for the Oracle Solaris Studio IDE: describes some tips and tricks to help you improve the performance of the Oracle Solaris Studio IDE.

    Read the article

  • How do sites avoid SEO issues / legalities with subdomain unique ids?

    - by JM4
    I was looking through a few websites recently and noticed a trend I'm not sure I understand. Sites are creating unique referral URLs for customers in the form of http://customname.site.com (if somebody were to use http://www.site.com/customname it would function the same way). Watching the network flow in Google Chrome, I can see the sites are doing a 302 redirect at some point, apparently via some sort of htaccess rule that takes the subdomain name (customname), applies it as a referral parameter, and then keeps it in session during the entire process. However, there must be thousands of these custom URLs that people are typing in. How is each one of these "subdomains" not treated as a separate URL which in turn redirects to the same page (in short, generating tons of links all pointing to the same page, which Google would normally frown upon)? Additionally, the links also appear on the sites themselves as clickable links, so I'm not sure how these are not tracked. Similarly, the "unique" URL is not indexed or cached in any Google search results. How is this capability handled? It does NOT highlight the referral aspect, but a true example of this is visiting http://sfgiants.com, which does a 302 redirect to the much longer proper San Francisco Giants MLB homepage. I am wondering how sfgiants.com is not indexed (assuming that direct shortened link appears on several MLB pages)?
    1 - I know these are 302 redirects; I can see this in the site's network flow.
    2 - These links do in fact appear on the page itself, because in some areas (for example) the bottom of the page may say "Send this page to a friend! http://name.site.com/", which in turn would again redirect to something like http://www.site.com?id=name so the id value could be stored in session.
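    For readers wondering what such a rule might look like, here is a minimal sketch assuming Apache mod_rewrite with wildcard DNS; the host pattern and the ref parameter name are hypothetical, not taken from any of the sites mentioned:

        # .htaccess (or virtual host config) with mod_rewrite enabled
        RewriteEngine On
        # Any host other than www.site.com ...
        RewriteCond %{HTTP_HOST} !^www\.site\.com$ [NC]
        # ... whose first label we capture as the customer name
        RewriteCond %{HTTP_HOST} ^([^.]+)\.site\.com$ [NC]
        # 302-redirect to the canonical host, carrying the name as a query parameter
        RewriteRule ^ http://www.site.com/?ref=%1 [R=302,L]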

    Read the article

  • Ubuntu One not syncing fully

    - by wurlyfan
    I have uploaded several folders of data to Ubuntu One from my desktop computer, over the last few weeks, and I can see that all the contents are there when I look at my account on the web. When I look at my laptop (connected to the same account), one of the first folders I uploaded hasn't downloaded completely. New folders added from either device seem to sync correctly, but this one older folder remains almost empty, even though the control panel says file syncing is up-to-date. I have plenty of space available. Stopping and restarting the sync daemon and rebooting the laptop are both ineffective. What can I do to make this folder sync fully? I don't want to risk losing the data (which now exists only in Ubuntu One), and I don't have a lot of broadband data to play with. I've seen several bugs relating to this sort of issue but they're all quite old and apparently fixed, while this is happening on new 13.04 installations on both desktop and laptop.
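    If it helps to see what the sync daemon thinks it is doing, the Ubuntu One client of that era shipped a small command-line tool; a hedged sketch of the usual checks (flag names varied between client versions, so verify against u1sdtool --help):

        # Show the current state of the sync daemon
        u1sdtool --status
        # List items still queued for upload/download
        # (older clients split this into --waiting-metadata / --waiting-content)
        u1sdtool --waiting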

    Read the article

  • Problems running Ubuntu 12.10 in VMware Player 5

    - by Tiim
    I'm trying to install Ubuntu 12.10 in VMware Player 5. My PC is running Windows 7. The installation seems to go okay, but when I boot up Ubuntu, although it starts up and I see the desktop, I can't see any task bar or basically anything other than the default wallpaper in the background. When I move my cursor across the screen towards the edges, the desktop flickers and extremely distorted and pixelated objects appear momentarily. I've tried uninstalling everything, downloading it all again and reinstalling, but I get the exact same problem. Does anyone recognise this issue? Is there a more stable pairing than Ubuntu 12.10 and VMware Player 5.0.1, perhaps?

    Read the article

  • X crash at login for 1 user

    - by marxjohnson
    User switching wasn't working on my 12.04 LTS desktop (just dropped me to TTY8 with a blinking cursor) so I tried to manually start a second X session by logging in to TTY6 and running startx -- :1. This didn't work either, and my machine locked up. Now when I try to log in as the second user from LightDM, X instantly crashes and I'm thrown back to the login screen. Other accounts on the machine work fine, and it happens for every desktop environment. I've had a poke around in my home directory, but I can't see anything obvious to change/delete to get it working again. Can anyone advise please?
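    Not an answer from the thread, but these are the files usually checked first when a single account's X session dies at login; the paths are the stock Ubuntu ones and the renames are only illustrative:

        # Per-user session errors for the affected account
        less ~/.xsession-errors
        # X server log from the failed attempt (the display number may differ)
        less /var/log/Xorg.0.log.old
        # Stale per-user state is a common culprit; move it aside and retry the login
        mv ~/.Xauthority ~/.Xauthority.bak
        mv ~/.ICEauthority ~/.ICEauthority.bak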

    Read the article

  • Ubuntu 13.10 - Black screen after logging in after installing nVidia drivers

    - by Javacow
    I recently installed Ubuntu 13.10 in a dual-boot with Windows 7, so I'm still quite new to using Linux. Most things were working fine, and I could log in normally (apart from the first login after install, which spent about 2 minutes on a black screen before going to the desktop). I installed the restricted Nvidia drivers with the command: sudo apt-get install nvidia-current Since then, after I enter my password and log in (the login screen itself works perfectly), I get a black screen with the cursor and nothing happens from that point onwards. Basically, what I would like to know is how to get back to the normal Ubuntu desktop and (hopefully) still be able to use Nvidia drivers.
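    For reference, a commonly suggested way to get back to a working desktop is to remove the proprietary driver again from a text console; this is a hedged sketch rather than a guaranteed fix, and it assumes no other Nvidia packages were pulled in:

        # Switch to a console with Ctrl+Alt+F1 and log in, then:
        sudo apt-get purge nvidia-current
        # Remove any xorg.conf the driver setup may have generated
        sudo rm -f /etc/X11/xorg.conf
        sudo reboot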

    Read the article

  • Apt Configuration problem

    - by Paul
    I am trying to load v11.04 onto my HP desktop hard drive using a USB drive. I used the same USB drive to load the system on my Dell netbook and it loaded successfully. But trying to load it on the desktop I get the error "Apt configuration problem. An attempt to configure apt to install additional packages from the CD failed", and the installer then crashes. I have nothing in the CD drive. I can boot the system from the USB drive and it works fine. I have tried all options on the load, with the same crash each time. I'm baffled, as I successfully loaded my netbook from the same USB drive with no problems.

    Read the article

  • 12.04 LTS won't install from CD

    - by Rob Hays
    I've been trying to install 12.04 LTS onto a Dell with a PIII from CD. Booting from the CD, the install gets through the "Who are you" step and begins copying files. The progress bar gets as far as the last period in "Copying files...". The box clears, and an error box comes up: "The installer has encountered an unrecoverable error. A desktop session will now be run so that you may investigate the problem or try installing again." When I try to install from this desktop session, the install gets to the same point, the copying-files box closes, and then it just stops. The pointer is busy, the CD drive spins up occasionally with no data transfer, and there is no hard drive activity. When I boot from the CD and access the disk boot menu, the disk checks good and memory checks good (I upgraded the original memory to 512 MB). I also updated the BIOS to the newest from Dell. This is an older L866r, but it should meet the requirements.

    Read the article

  • How do I revert from nitrogen back to Ubuntu's default wallpaper manager?

    - by jonalmeida
    I was using nitrogen when I was using Openbox, but now I'm trying to stop it from setting the wallpaper so I can use the Ubuntu default (i.e. via the System Settings panel). What I've done so far:
    - sudo apt-get remove nitrogen
    - Removed nitrogen configs at ~/.config/nitrogen/
    - Checked to make sure that draw_background in /desktop/gnome/background was checked
    So far it still doesn't work. I can't right-click on the desktop, so I'm guessing that I've missed something to get Nautilus back to drawing icons on the screen. Any help is much appreciated. Thanks!
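    For anyone checking the same thing from a terminal, the desktop-drawing switch lives in two places depending on the Ubuntu/GNOME version; the key names below are the commonly documented ones, so treat them as assumptions for your release:

        # Older (gconf) key, the one referenced above
        gconftool-2 --get /desktop/gnome/background/draw_background
        gconftool-2 --type bool --set /desktop/gnome/background/draw_background true
        # Newer (gsettings) equivalent
        gsettings get org.gnome.desktop.background show-desktop-icons
        gsettings set org.gnome.desktop.background show-desktop-icons true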

    Read the article

  • ubuntu server in a vm, can't connect to internet

    - by jessh
    I'm attempting to host my own development web server in a VirtualBox guest running Ubuntu Server. I would like this virtual machine to be accessible not only from my home network, but from outside the LAN as well. As such, I've set up a static IP (so I can later forward ports to it). My VirtualBox settings have this VM using only one adapter, in bridged mode. Here's what my /etc/network/interfaces looks like:
    iface eth0 inet static
    address 10.0.1.203 /*this is outside the DHCP range*/
    netmask 255.255.255.0
    gateway 10.0.1.1
    network 10.0.1.0
    broadcast 10.0.1.255
    dns-nameservers: 8.8.8.8 8.8.8.4
    Here's what the output of ifconfig looks like: https://dl.dropbox.com/u/2241201/locker/ubuntu.png
    My host is a Mac mini running OS X 10.7. From within the guest, if I ping google.com:
    $ ping google.com # outputs 'ping: unknown host google.com' immediately
    Why am I unable to access the web?
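    Purely as a point of comparison, a static stanza in the usual /etc/network/interfaces syntax looks like the sketch below. Note that dns-nameservers takes no colon and needs the resolvconf package to take effect, and that Google's public resolvers are 8.8.8.8 and 8.8.4.4; the addresses otherwise mirror the ones in the question:

        auto eth0
        iface eth0 inet static
            address 10.0.1.203
            netmask 255.255.255.0
            gateway 10.0.1.1
            # no colon after the option name
            dns-nameservers 8.8.8.8 8.8.4.4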

    Read the article

  • Welcome to our Friday tips series!

    - by Chris Kawalek
    Today we're starting a brand new blog series. For your Friday afternoon reading, we'll be posting a technical tip or a question and answer on a technical topic. We'll start by introducing ideas on our own, but we'd really like it if you were involved and asked us questions via Twitter! Tag your tweet with #AskOracleVirtualization and we'll consider your question for the blog. Today's tip is on storage and Oracle Virtual Desktop Infrastructure.
    Question: I run Oracle Virtual Desktop 3.4.1 on Solaris and use a local ZFS storage pool. How should I configure my ZFS ARC cache?
    Answer by John Renko, Consulting Developer, Oracle: Oracle recommends about 5 GB of ARC cache per template in use to achieve up to a 90% disk read offload. Set your ARC min equal to max to reserve the maximum amount of your remaining memory for your running VMs. In /etc/system:
    set zfs:zfs_arc_min = 5368709120
    set zfs:zfs_arc_max = 5368709120
    The amount you need to reserve will depend on your template, but this has proven to be a great start for a typical Windows 7 VM running productivity applications.
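    If you want to confirm the setting took effect after a reboot, the ARC counters can be read from kernel statistics on Solaris; a quick check might look like this (the kstat names are the commonly documented ones, so verify on your release):

        # Current ARC size in bytes
        kstat -p zfs::arcstats:size
        # Floor and ceiling picked up from /etc/system
        kstat -p zfs::arcstats:c_min zfs::arcstats:c_max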

    Read the article

  • How do I set up a headless server via VNC?

    - by Joe
    I'm trying to configure my system to run headless and access the desktop via VNC when necessary. It seems that every time I unplug the monitor while Ubuntu is running, the system freezes and I am forced to do a hard shutdown. If I start the computer up without a monitor, it won't boot all the way and I am still unable to access the desktop through VNC. I am able to VNC to it while there is a monitor attached. Automatic login is enabled. I want to VNC into my Ubuntu machine without a monitor.
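    One route people often take for a truly headless box is a VNC server that hosts its own virtual X display instead of exporting the physical screen, so no monitor needs to be attached at all; a minimal sketch, assuming the tightvncserver package:

        # Install a VNC server with its own virtual display
        sudo apt-get install tightvncserver
        # Start a virtual desktop on display :1 (connect a viewer to host:5901)
        vncserver :1 -geometry 1280x800 -depth 24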

    Read the article

  • Setting umask globally

    - by DevSolar
    I am using a private user group setup, i.e. a user foo's home directory is owned by foo:foo, not foo:users. For this to work, I need to set the umask to 002 globally. After a quick grep -RIi umask /etc/*, it seemed for a moment that modifying the UMASK entry in /etc/login.defs should do the trick. It does, too -- but only for console logins. If I log in to my desktop, and open a terminal there, I still get to see the default umask 022. Same goes for files created from apps started through the menu. Apparently, the display manager (or whatever X11 component responsible) does source some different setting than a console login does, and damned if I could tell which one it is. (I tried changing the setting in /etc/init.d/rc, and no, it did not help.) How / where do I set umask globally, so that the X11 desktop environment gets the memo as well?
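    One approach that is often suggested for exactly this gap (display-manager logins ignoring /etc/login.defs) is to let PAM set the umask for every session type; a hedged sketch, assuming a Debian/Ubuntu-style PAM layout:

        # Append to /etc/pam.d/common-session (and common-session-noninteractive if desired)
        # so that graphical and console logins get the same umask:
        session optional pam_umask.so umask=0002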

    Read the article

  • Reasons to Use a VM For Development

    - by George Stocker
    Background: I work at a start-up company, where one team uses virtual machines to connect to a remote server to do their development, and another team (the team I'm on) uses local IIS/SQL Server 2005/Visual Studio installations to conduct work. Team VM is located about 1000 miles from Team Non-VM, and the servers the VMs run off of are located near Team VM (latency, for those who are wondering, is about 50 ms). A person high in the company is pushing for Team Non-VM to use virtual machines for programming, development, and testing. The latter point we agree on: we want virtual machines to test configurations and various aspects of the web application in a 'clean' state.
    The problem: What we don't agree on is having developers use RDP to connect to a remote desktop that contains Visual Studio, SQL Server, and IIS to do the same development we could do locally on our laptops. I've tried the VM set-up, and besides the color issue, there is a latency issue that is rather noticeable; not to mention that since we're a start-up, a good number of employees work from home on occasion with our work laptops, and this move would cut off the laptops. They'd be turned in.
    Reasons to use remote VMs for development (not testing!). Here are the stated reasons that this person wants us to use VMs:
    - They work for Team VM.
    - They keep the source code "safe".
    - If we want to work from home, we could just use our home PCs.
    - Licenses (I don't know what the argument is, only that it's been used).
    Reasons not to use remote VMs for development. Here are the stated reasons why we don't want to use VMs:
    - We like working from home. We get a lot done on our own time.
    - We're not going to use our home PCs to do work-related stuff.
    - The latency is noticeable.
    - Support for the VMs (if they go down, or if we need a new VM) takes a while.
    - We don't have administrative privileges on the VMs, and we are unable to change settings as needed.
    What I'm looking for from the community is this: what reasons would you give for not using VMs for development? Keep in mind these are remote VMs; this isn't a VM running on a local desktop. It's using the laptop (or a desktop) as a thin client for a remote VM. Also, on the other side of the coin: is there something we're missing that makes VMs more palatable for development?
    Edit: I think 'safe' is used in terms of corporate espionage, or more precisely the risk that if a laptop gets stolen, the thief would have access to our source code. The former, as we've pointed out, is always going to be a possibility; companies stop that with litigation, and there isn't a technical solution as far as I can see. The latter (though I don't know its usefulness in a corporate scenario) is mitigated by TrueCrypt'ing the entire volume.

    Read the article

  • Mail being sent as root on Ubuntu 14.04

    - by Benjamin Allison
    I'm really struggling with this. I'm trying to set up this server to send mail using Gmail's SMTP. Google keeps bouncing the messages, saying that authentication is required:
    smtp.gmail.com[74.125.196.109]:25: 530-5.5.1 Authentication Required. Learn more at
    smtp.gmail.com[74.125.196.109]:25: 530 5.5.1 http://support.google.com/mail/bin/answer.py?answer=14257
    But it seems my server is trying to send mail as [email protected]. I'm baffled. Here's what I've done so far.
    Updated main.cf:
    relayhost = [smtp.gmail.com]:587
    smtp_sasl_auth_enable = yes
    smtp_sasl_password_maps = hash:/etc/postfix/sasl_passwd
    smtp_sasl_security_options = noanonymous
    smtp_tls_CAfile = /etc/postfix/cacert.pem
    smtp_use_tls = yes
    Created /etc/postfix/sasl_passwd:
    [smtp.gmail.com]:587 [email protected]:password
    Then did the following:
    sudo chmod 400 /etc/postfix/sasl_passwd
    sudo postmap /etc/postfix/sasl_passwd
    cat /etc/ssl/certs/Thawte_Premium_Server_CA.pem | sudo tee -a /etc/postfix/cacert.pem
    service postfix restart
    I can't for the life of me get a mail message to send, or change the default mail user from [email protected] to [email protected] (FWIW, I'm using Google Apps, which is why it's not a .gmail address).
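    Not part of the original post, but two checks that usually help with this kind of relay problem: confirm which settings Postfix is actually running with (the bounce above mentions port 25 even though relayhost points at 587, which is exactly the sort of mismatch this surfaces), and watch the mail log while a test message goes out. A minimal sketch, assuming mailutils provides the mail command:

        # Show only the non-default Postfix settings currently in effect
        postconf -n | grep -E 'relayhost|smtp_sasl|smtp_tls'
        # Send a test message and follow the SMTP conversation in the log
        echo "test body" | mail -s "relay test" someone@example.com
        sudo tail -f /var/log/mail.log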

    Read the article

  • Sniff packets using tcpdump

    - by denisk
    I have a completely noob question. I want to see all packets that come to my computer from a particular site (google.com). So I start tcpdump:
    sudo tcpdump -i eth0 host google.com
    Then I enter google.com in a browser and hit Enter, and nothing gets captured. I can't figure out why this happens. What am I doing wrong?
    Edit: It turned out I was listening on the wrong interface. I changed eth0 to any and it worked; it was ppp1 that needed listening on. Thanks for your answers!
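    For readers following along, the adjusted invocation described in the edit, capturing on all interfaces rather than just eth0, looks like this:

        # Capture traffic to or from google.com on every interface (including ppp1)
        sudo tcpdump -i any host google.com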

    Read the article

  • Social Media Stations for Partners

    - by Oracle OpenWorld Blog Team
    By Stephanie Spada
    One of our exciting additions to this year's Oracle Partner Network Exchange @ OpenWorld are Social Media Stations. Partners have the opportunity to get customized, face-to-face expert advice on how they can better engage their customers and find new prospects online using social media tools.
    When: Sunday, September 30
    Time: 3:00 p.m.–5:00 p.m.
    Where: Moscone South, Esplanade level
    When: Monday, October 1
    Time: 9:30 a.m.–6:00 p.m.
    Where: Moscone South, OPN Lounge, Exhibitor level
    Each customized social media consultation will take only 25 minutes. Here's how it works:
    - Partners check in with a Social Media Rally coordinator, who will assess needs and make the right connections for each session.
    - Partners go to the Photo Station, where a headshot will be taken that can be used on social profiles, websites, or for articles and posts across the Web.
    - Partners meet with the One-2-One consultants, who will walk them through how they're using social media today and what next steps could be.
    Social media channels/methods discussed can include Google+, Google Alerts, Google Analytics, Facebook, LinkedIn, Search Engine Optimization, Twitter, and more. With so many choices, partners can decide how to focus their time.
    To get the most out of the Social Media Stations, partners should:
    - Wear appropriate attire for the headshot photo.
    - Bring log-in information for the social platforms they want to discuss.
    - Come prepared with questions for the One-2-One consultation so session time can be maximized.
    For questions, or to schedule a session ahead of time, partners should send an email to [email protected].

    Read the article

  • Different behavior when launched from terminal instead of Unity launcher

    - by dgkontopoulos
    I have written two Perl/Gtk programs. When launched from the dash menu, they run just fine. However, if I try to launch them from terminal using the very same command found in their .desktop files, their Unity launcher will be blurry and will remain inactive when clicked, if I keep it in the Unity bar. The problem is solved if the Exec part of the desktop file is replaced with perl path_to_script However, that leads to other problems, including a lintian warning and forcing all Perl GUI applications running from terminal to adopt the same launcher. This issue is quite annoying since one of the programs relies on a different (Python) program in order to be launched and this results in having a blurry and inactive launcher.
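    For context, the two Exec forms being compared would look roughly like this inside the .desktop file; the paths and names here are placeholders, not the poster's actual files:

        # ~/.local/share/applications/myapp.desktop (illustrative)
        [Desktop Entry]
        Type=Application
        Name=My Perl App
        # Works from the Dash, but running this same command from a terminal
        # produces the blurry, inactive launcher described above:
        Exec=/home/user/bin/myapp.pl
        # Workaround mentioned above (fixes the terminal case but triggers the
        # lintian warning and the shared launcher icon):
        # Exec=perl /home/user/bin/myapp.pl
        Icon=myapp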

    Read the article

  • In Windows Vista, when starting any and every program, a "bad image" error is generated

    - by Mark Hatton
    I have a problem where, whenever any program is started under Windows Vista, the following error message is generated:
    Bad Image
    C:\PROGRA~1\Google\GOOGLE~3\GOEC62~1.DLL is either not designed to run on Windows or it contains an error. Try installing the program again using the original installation media or contact your system administrator or the software vendor for support.
    This happens for every program started, including those that start automatically at boot time. My Google-fu is failing to solve this for me. I have already tried an "sfc /scannow", which did find some problems but said that it could not correct them. What might cause this problem? How might it be resolved?

    Read the article

  • Chromium web-app creation doesn't work in 12.10?

    - by speter
    If you create an application shortcut for a website in Chromium and choose "Desktop", a .desktop file is created. In 12.04 you were able to move it to ~/.local/share/applications/ and then start it from the Dash or the launcher. In my fresh 12.10 installation this doesn't work anymore. I think the line that is misinterpreted is Exec=/usr/bin/chromium-browser --app=http://buymeapie.com/. If you try to run it from a terminal, you get the following result:
    20:32 ~ speter Exec=/usr/bin/chromium-browser --app=http://buymeapie.com/
    bash: --app=http://buymeapie.com/: File or directory not found ("Datei oder Verzeichnis nicht gefunden" in the original German)
    Can anyone explain this, or does anyone know a workaround?
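    As an aside, the terminal test above fails for a shell reason rather than a Chromium one: bash parses Exec=/usr/bin/chromium-browser as a temporary variable assignment and then tries to execute --app=http://buymeapie.com/ as a command, hence the "file not found" error. Testing just the command portion of the Exec line would look like this:

        # Run the browser part of the Exec line directly, without the "Exec=" key
        /usr/bin/chromium-browser --app="http://buymeapie.com/"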

    Read the article

  • Service Layer - how broad should it be, and should it also be used from the local application?

    - by BornToCode
    The background: I need to build a desktop application with some operations (CRUD and more) (= WinForms), and I need to make another application which will re-use some of the functions of the main application (= WebForms). I'm using a service layer for reusing my functions. The service calls the functions in the BL layer (correct me if I'm doing this wrong), so my desktop solution has 4 projects: DAL, BL, UI, WEBSERVICES.
    The dilemma (simple, but I still need some more experienced opinions):
    1. In my main WinForms UI, should I call the functions from the BL directly, e.g. bl.getcustomers(), or do it the same way I call them in the WebForms app and go through the service, e.g. webservices.getcustomers?
    2. Should I create a service for every single function in the BL, even if I need some of the functions in only one UI? For example, should I create services for all the CRUD operations, even though I need to re-use only the update operation in the WebForms app?
    YOUR HELP IS MUCH APPRECIATED

    Read the article

  • How to disable Spotlight content indexing in Mac OS

    - by o.v.
    From Windows experience, I could always elect Live search to only index file names not their content. Is this something that can be done with Spotlight on a Mac? It used to index absolutely everything, for instance it would return a bunch of video files for any obscure character combination typed into the search field. Right now I've disabled Spotlight entirely as per this answer, but it seems to have disabled searching altogether. For instance, Finder is yet to locate any .pdf files in a small directory as I'm typing this question (unlike windows search which would still be able to work even with indexing disabled) Alternatively, if there is any way (including a trusted third-party app) that will index file names and metadata e.g. ID3 tags that would likely be the preferred option.
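    Two built-in commands are relevant here, though neither gives a names-only index on its own, so treat this as a sketch of what is available rather than a full solution: mdutil toggles Spotlight indexing per volume, and mdfind -name queries the existing index by file name only, ignoring contents.

        # Turn Spotlight indexing off (or back on) for the startup volume
        sudo mdutil -i off /
        sudo mdutil -i on /
        # Search the index by file name only (example filename)
        mdfind -name "budget.pdf"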

    Read the article

  • I am unable to use the Wubi installer: I get the message "ERROR TaskList: Cannot download the metalink and therefore the ISO"

    - by pat
    I used Wubi a few months ago on both XP and Win7 systems with no problem, but I have been unable to install on either for the last 2 weeks. From the log:
    09-05 11:36 DEBUG CommonBackend: Could not find any ISO or CD, downloading one now
    09-05 11:36 DEBUG TaskList: New task get_metalink
    09-05 11:36 DEBUG TaskList: ### Running get_metalink...
    09-05 11:36 DEBUG downloader: downloading http://cdimage.ubuntu.com/xubuntu/releases/12.04/release/xubuntu-12.04-desktop-amd64.metalink > C:\ubuntu\install
    09-05 11:36 ERROR CommonBackend: Cannot download metalink file http://cdimage.ubuntu.com/xubuntu/releases/12.04/release/xubuntu-12.04-desktop-amd64.metalink err=[Errno 14] HTTP Error 404: Not Found
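    Not from the original report, but a quick way to confirm the 404 from your own machine is to request the metalink URL from the log directly and look at the response headers (curl shown here; any HTTP client will do):

        # A 404 here means the file really is missing at that path on the server
        curl -I http://cdimage.ubuntu.com/xubuntu/releases/12.04/release/xubuntu-12.04-desktop-amd64.metalink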

    Read the article
