Search Results

Search found 24931 results on 998 pages for 'information visualization'.


  • Troubleshooting VMware on Ubuntu

    Summary of different problems encountered while using VMware products on Ubuntu. This article will be updated from time to time with new information about running VMware products more or less smoothly on Ubuntu. Links to existing articles: Running VMware Player on Linux (Xubuntu Hardy Heron); Running VMware Server on Linux (version 1.0.6 on Xubuntu); Using ext4 in a VMware machine.

    VMware mouse grab/ungrab problem (source: LinuxInsight): Upgrading the GTK library in Ubuntu since Karmic Koala gives you strange mouse behaviour. Even if you have the "Grab when cursor enters window" option set, VMware won't grab your pointer when you move the mouse into the VMware window. Also, if you use Ctrl-G to capture the pointer, the VMware window will release it as soon as you move the mouse around a little bit. Quite annoying behavior. Fortunately, there's a simple workaround that can fix things until VMware resolves the incompatibilities with the new GTK library. VMware Workstation ships with many standard libraries, including libgtk, so the only thing you need to do is force it to use its own versions. The simplest way to do that is to add the following line to the end of the /etc/vmware/bootstrap configuration file and restart Workstation: export VMWARE_USE_SHIPPED_GTK="force" The interface will look slightly odd, because an older version of GTK is being used, but at least it will work properly. Note: after upgrading to a new Linux kernel, the VMware modules need to be recompiled, which requires temporarily commenting out the export line in /etc/vmware/bootstrap.
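
    A quick way to apply the workaround from a shell (a sketch; it assumes the stock /etc/vmware/bootstrap location mentioned above):

        # append the GTK override to VMware's bootstrap file
        echo 'export VMWARE_USE_SHIPPED_GTK="force"' | sudo tee -a /etc/vmware/bootstrap
        # after a kernel upgrade, comment this line out again (e.g. with sudo nano
        # /etc/vmware/bootstrap) before letting VMware rebuild its kernel modules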

    Read the article

  • Does a successful exit of rsync -acvvv s d guarantee identical directory trees?

    - by user259774
    I have two volumes, one XFS and one NTFS; the NTFS volume was empty, and the XFS one had 10 subitems. I needed to sync them. I initially copied a few of the subitems by dragging them over in a GUI file manager. Several of the direct descendants that I had dragged apparently finished; one I stopped before it was done, and the rest I cancelled while the copy still appeared to be gathering information about the files. Then I ran rsync -acvvv xmp/ nmp/, where xmp and nmp are the volumes' respective mountpoints, and it exited with a 0 status. find xmp -printf x | wc -c and find nmp -printf x | wc -c both return 372926. My question is: am I guaranteed that the two drives' contents are identical?
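
    One hedged way to double-check after the fact is a checksum-based dry run; it should list no file transfers if the trees' contents match (permission and ownership differences may still show up on NTFS, which is why this sketch uses -r rather than -a):

        rsync -rcn --itemize-changes --delete xmp/ nmp/
        # -c forces per-file checksums, -n makes it a dry run, --delete also flags
        # files that exist only on the destination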

    Read the article

  • Script errors when run by launchd at startup, but not when run in Terminal

    - by Mechcozmo
    I'm attempting to create a RAM disk that loads its previous contents when the system starts up and writes the contents back to a disk image every six hours. Currently, when you run the script from the terminal ("sudo bash LogToRAM.sh") everything works fine, but when it is run from launchd during startup it doesn't work. Here are the lines from the log; the first line just gives some idea of where in the boot process we are:

      SecurityAgent[202] Showing Login Window
      com.mechcozmo.LogToRAM[51] + /Developer/usr/bin/SetFile -a V /Volumes/LogfileRAMdisk
      com.mechcozmo.LogToRAM[51] ERROR: File Not Found. (-43) on file: /Volumes/LogfileRAMdisk
      com.mechcozmo.LogToRAM[51] + /usr/sbin/asr -source '/Library/Application Support/LogToRAM/RAMdisk_store.dmg' -target /Volumes/LogfileRAMdisk/ -noverify

    Here is the script and plist file in question. Note that 'set -vx' is at the top of the script; it gives a lot of information about what is happening in the script. My current theory is that the /Volumes directory does not exist at this stage of the boot process, but to be honest that seems unlikely.
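
    If the theory about /Volumes (or the disk image's parent path) not being ready yet is right, a small guard at the top of LogToRAM.sh is one hedged way to test it (a sketch only, using the paths quoted above):

        # wait up to ~2 minutes for /Volumes to exist before building the RAM disk
        i=0
        while [ ! -d /Volumes ] && [ "$i" -lt 60 ]; do
            sleep 2
            i=$((i + 1))
        done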

    Read the article

  • File store: CouchDB vs SQL Server + file system

    - by Andrey
    I'm exploring different ways of storing user-uploaded files (all MS Office documents or similar) on our high-load web site. It's currently designed to store documents as files, with a SQL database holding all the metadata for those files. I'm concerned about outgrowing the storage server, and about SQL Server performance once the number of documents reaches hundreds of millions. I've been reading a lot of good information about CouchDB, including its built-in scalability and performance, but I'm not sure how storing files as attachments in CouchDB would compare to storing files on a file system in terms of performance. Has anybody used CouchDB clusters for storing LARGE numbers of documents in a high-load environment?
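
    For reference, CouchDB stores file attachments over its plain HTTP API; a minimal upload sketch (the database name, document id, and revision here are hypothetical):

        curl -X PUT "http://localhost:5984/docs/doc123/report.docx?rev=1-abc123" \
             -H "Content-Type: application/vnd.openxmlformats-officedocument.wordprocessingml.document" \
             --data-binary @report.docx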

    Read the article

  • VGA no signal on LCD monitor attached to laptop

    - by Paul
    I bought a new Asus VH242H LCD monitor for use with my Lenovo T60 laptop running XP Professional. Display info under Control Panel says "Intel(R) 945GM Express Chipset Family". I am connecting via VGA. When I connect the monitor I get "VGA no signal" and the monitor screen stays blank. I have selected the monitor as the display device on the laptop. The information screen on the monitor shows the correct resolution from the laptop, so the monitor is communicating with the laptop in some way. I've successfully tried the monitor with my Dell Inspiron 1525 running Windows Vista, I've changed the VGA cable to one I know works, and I've tried different resolutions. I cannot find any specific drivers on the internet for this monitor, so I assume it should work with plug and play. Does anyone know what the problem could be?

    Read the article

  • Is Dell's server software bundle necessary? (Poweredge 2950 in my case)

    - by bwerks
    Hi all, Dell includes a fair amount of software with its servers, but I'm having a hard time determining from the documentation what each piece does and whether or not I should install it. Dell's support site (unless I'm doing it wrong) seems fairly opaque to me, and its offerings fairly unstandardized in terms of their usage, so if possible I'd like to stay away from them. Specifically, I'm curious whether any of the features offered are duplicated in something like Microsoft System Center. For additional background, I'm working with a PowerEdge 2950 that was just rebuilt with an expanded RAID-6, but initially I just installed Server 2008 R2 directly instead of using the Build and Update utility. There's nothing of use on it at the moment, so I'm totally open to wiping it again.

    Read the article

  • chdir warning when opening .tar file on OS X

    - by denonth
    I need to unarchive a file to the /Developer folder. The instructions say:

    Install Qt for iOS SDK. The Qt for iOS SDK has been configured to be installed in the default Xcode installation location /Developer. It is not possible to install the SDK into another location without first rebuilding it, as the install location is contained within the qmake executable, and that is built as part of Qt. To install the Qt for iOS SDK, open 'Terminal' and type the following from the command line: tar -xf qt-everywhere-ios-4.8.0-xxx.tar.gz -C /Developer (where xxx is an identifier which can be used to determine the build of the iOS SDK, e.g. arm7-nossl). This will install the Qt for iOS SDK into the following path: /Developer/Platforms/iPhoneOS.platform/Developer/usr/share/qt-everywhere-ios-4.8.0

    When I perform the operation I get:

      Lions-Mac:Documents User$ tar -xf qt-everywhere-ios-4.8.0-arm7-nossl.tar.gz -C /Developer
      tar: could not chdir to '/Developer'

    Any idea what is wrong?
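
    A hedged guess worth checking: on a system without the old Xcode layout, /Developer may simply not exist, so tar cannot chdir into it. Creating it first and extracting as root is one thing to try (a sketch, not a confirmed fix):

        sudo mkdir -p /Developer
        sudo tar -xf qt-everywhere-ios-4.8.0-arm7-nossl.tar.gz -C /Developer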

    Read the article

  • OAuth2 vs Public API

    - by Adam Tannon
    My understanding of OAuth (2.0) is that it's a software stack and protocol that allows two or more web apps to share information about a single end user. User A is a member of Site B and Site C; Site B wants to fetch some data from Site C about User A, and this is where OAuth steps in. So first off, if this assessment is incorrect, please begin by clarifying and correcting me! Assuming I'm on the right track, then I guess I'm not seeing the need for OAuth to begin with (!). I'm sure I'm just not seeing the forest for the trees here, but the way I see it, couldn't Site C just expose a public API that Site B could use to fetch the same data (sans OAuth)? If Site C required user credentials to access the data, couldn't this public API just use HTTPS for secure transport and require a username/password as part of each API call? Again, I'm sure I'm missing something, but I'm just not understanding why I would need OAuth when a secure, public API written and exposed by Site C seems more than capable of delivering what Site B needs regarding User A. In general, I'm looking for a set of guidelines to go by when deciding between using OAuth for my web apps and just writing my own web service (exposing a public API). Thanks in advance!
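
    For a concrete picture of the two approaches being compared, here is a rough sketch (the endpoints and token are hypothetical):

        # the question's alternative: Site B sends User A's Site C credentials on every call
        curl -u usera:password https://site-c.example.com/api/users/a/data
        # the OAuth 2.0 style: Site B presents a scoped, revocable access token instead of the password
        curl -H "Authorization: Bearer ACCESS_TOKEN" https://site-c.example.com/api/users/a/data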

    Read the article

  • Why can't I compare two Texture2D's?

    - by Fiona
    I am trying to use an accessor, as it seems to me that that is the only way to accomplish what I want to do. Here is my code:

    Game1.cs

    public class GroundTexture
    {
        private Texture2D dirt;

        public Texture2D Dirt
        {
            get { return dirt; }
            set { dirt = value; }
        }
    }

    public class Main : Game
    {
        public static Texture2D texture = tile.Texture;
        GroundTexture groundTexture = new GroundTexture();
        public static Texture2D dirt;

        protected override void LoadContent()
        {
            Tile tile = (Tile)currentLevel.GetTile(20, 20);
            dirt = Content.Load<Texture2D>("Dirt");
            groundTexture.Dirt = dirt;
            Texture2D texture = tile.Texture;
        }

        protected override void Update(GameTime gameTime)
        {
            if (texture == groundTexture.Dirt)
            {
                player.TileCollision(groundBounds);
            }
            base.Update(gameTime);
        }
    }

    I removed irrelevant information from the LoadContent and Update functions. On the following line:

    if (texture == groundTexture.Dirt)

    I am getting the error: Operator '==' cannot be applied to operands of type 'Microsoft.Xna.Framework.Graphics.Texture2D' and 'Game1.GroundTexture'. Am I using the accessor correctly? And why do I get this error? Dirt is a Texture2D, so they should be comparable. This uses a few functions from a program called Realm Factory, which is a tile editor. The numbers "20, 20" are just a sample of the level I made below; tile.Texture returns the sprite, which here is the content item Dirt.png. Thank you very much! (I posted this on the main Stack Overflow site, but after several days didn't get a response. Since it has to do mainly with Texture2D, I figured I'd ask here.)

    Read the article

  • Added resolution not working after upgrading to 12.04

    - by David
    After upgrading, my screen resolution (added by xrandr commands at startup) did not work like before. Messages appear showing errors that I never had in 11.10: "No se pudo aplicar la configuración almacenada para los monitores" ("Could not apply the stored configuration for the monitors"). This script didn't work either:

      xrandr --newmode "1280x1024_60.00" 109.00 1280 1368 1496 1712 1024 1027 1034 1063 -hsync +vsync
      xrandr --addmode VGA1 1280x1024_60.00
      xrandr --output VGA1 --mode 1280x1024_60.00

    I also tried deleting monitors.xml, but nothing; that only gets rid of the message window. It's been said that this is a normal, well-known problem for PCs with Intel integrated video cards. The new version of gnome-settings-daemon stores its configuration information in dconf rather than gconf. I tried something, but the problem persists. This is what I did: install the dconf-tools package, then run dconf-editor. In the tree on the left, navigate to org > gnome > settings-daemon > plugins > xrandr and uncheck the "active" checkbox, then restart your X server (Ctrl+Alt+Backspace). (It didn't work out for me, but it may be helpful to someone.)
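
    The dconf-editor steps above can also be done from a terminal; a sketch (the schema path may differ slightly between gnome-settings-daemon versions, so treat it as an assumption):

        gsettings set org.gnome.settings-daemon.plugins.xrandr active false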

    Read the article

  • Approach for monitoring internet backbone traffic volume

    - by Greg Harman
    I'm interested in getting a picture of relative volume across different internet backbones. In particular, I'd like to see how traffic volume over a given route differs over the course of a day or from one day to the next. InternetTrafficReport.com is the closest approximation to this that I've found online, and their approach is to test ping times to a number of key routers from several geographically-dispersed servers. This sounds like one straightforward way to measure, but I don't have several geographically-dispersed servers. Is there a different approach for sampling this type of information from a single server?
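
    A single-server version of the same ping-sampling idea is easy to sketch (the router hostnames here are hypothetical; append the output to a log and graph it later):

        for host in core1.example-net.net core2.example-net.net; do
            # one timestamped line per target with the rtt min/avg/max summary
            echo "$(date -u +%FT%TZ) $host $(ping -c 5 -q "$host" | tail -n 1)"
        done >> backbone-latency.log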

    Read the article

  • What can cause peaks in PageTables in /proc/meminfo?

    - by Fuzzy76
    I have a gameserver running Debian Lenny on a VPS host. Even when experiencing a fairly low load, the players start experiencing major lag (ping times rise from 50 ms to 150-500 ms) in bursts of 3-10 seconds. I have installed Munin server monitoring, but when looking at the graphs it looks like the server has plenty of CPU, RAM and bandwidth available. The only weird thing I noticed is some peaks in the memory graph attributed to "page_tables", which maps to PageTables in /proc/meminfo, but I can't find any good information on what this might mean. Any ideas what might be causing this? If you need any more graphs, just let me know. The interrupts/second count is roughly 400-600 during this period (nearly all from eth0). The drop in committed memory was caused by me trying to lower the allocated memory for the server from 512 MB to 256 MB, but that didn't seem to help.
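
    To correlate the Munin peaks with live values during a lag burst, the underlying counters can be watched directly (a simple sketch):

        watch -n 5 'grep -E "PageTables|Committed_AS" /proc/meminfo'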

    Read the article

  • Laptop overheating within minutes of start up

    - by Spik330
    I have a Dell laptop running Windows 7 Home Premium with an i7-720QM. More information on the computer can be found here: http://www.dell.com/support/home/us/en/04/product-support/servicetag/51CVCN1/configuration The problem I am having is that the computer overheats unnaturally fast. From boot to when I can run my diagnostic tools takes about two minutes, and by then the CPU temperature is 86°C; after a few more minutes it reaches 100°C and the computer black-screens and shuts down. In total the laptop can only run for 3-5 minutes before shutting off completely, and during this time nothing intensive is running. After the laptop shuts down you have to wait for it to cool down, or it will shut off even faster, sometimes in 7-15 seconds while still on the boot screen. Does anyone know what the problem could be? Maybe a sensor, or is the computer fried?

    Read the article

  • Unable to mount USB drive: Error creating mount point: Permission denied

    - by steve
    Whenever I plug a USB drive into my computer, a window pops up that says: Unable to mount [Name of USB]: Error creating mount point: Permission denied

      steve@goliath:/$ uname -a
      Linux goliath 3.2.0-32-generic #51-Ubuntu SMP Wed Sep 26 21:33:09 UTC 2012 x86_64 x86_64 x86_64 GNU/Linux
      steve@goliath:/$ sudo fdisk -l

      WARNING: GPT (GUID Partition Table) detected on '/dev/sda'! The util fdisk doesn't support GPT. Use GNU Parted.

      Disk /dev/sda: 120.0 GB, 120034123776 bytes
      255 heads, 63 sectors/track, 14593 cylinders, total 234441648 sectors
      Units = sectors of 1 * 512 = 512 bytes
      Sector size (logical/physical): 512 bytes / 512 bytes
      I/O size (minimum/optimal): 512 bytes / 512 bytes
      Disk identifier: 0x0f716ee1

         Device Boot      Start         End      Blocks   Id  System
      /dev/sda1               1   234441647   117220823+  ee  GPT

      WARNING: GPT (GUID Partition Table) detected on '/dev/sdb'! The util fdisk doesn't support GPT. Use GNU Parted.

      Disk /dev/sdb: 1500.3 GB, 1500301910016 bytes
      255 heads, 63 sectors/track, 182401 cylinders, total 2930277168 sectors
      Units = sectors of 1 * 512 = 512 bytes
      Sector size (logical/physical): 512 bytes / 512 bytes
      I/O size (minimum/optimal): 512 bytes / 512 bytes
      Disk identifier: 0x0f710ee1

         Device Boot      Start         End      Blocks   Id  System
      /dev/sdb1               1  2930277167  1465138583+  ee  GPT

      Disk /dev/sdc: 16.0 GB, 16005464064 bytes
      74 heads, 10 sectors/track, 42244 cylinders, total 31260672 sectors
      Units = sectors of 1 * 512 = 512 bytes
      Sector size (logical/physical): 512 bytes / 512 bytes
      I/O size (minimum/optimal): 512 bytes / 512 bytes
      Disk identifier: 0xc3072e18

         Device Boot      Start         End      Blocks   Id  System
      /dev/sdc1            8064    31260671    15626304    c  W95 FAT32 (LBA)

      steve@goliath:/$ sudo mkdir /media/external
      mkdir: cannot create directory `/media/external': Permission denied
      steve@goliath:/$ sudo mkdir /media/usb0
      mkdir: cannot create directory `/media/usb0': Permission denied
      steve@goliath:/$ sudo ls -l / | grep media
      drwxr-xr-x 3 root root 4096 Oct 3 22:48 media
      steve@goliath:/$ ls /media/ -a
      .  ..  MediaShare

    MediaShare is the directory on my server that has all my movies and music. If there is any information I left out, please let me know.
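
    A couple of hedged checks, since even root is denied mkdir under /media: the root filesystem may have gone read-only, or /media may carry an immutable attribute (the test file name below is just an example):

        mount | grep ' / '           # look for "ro" among the root filesystem's mount options
        lsattr -d /media             # an 'i' (immutable) flag here would explain the mkdir failures
        sudo touch /media/.writetest && sudo rm /media/.writetest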

    Read the article

  • Problem using Polish keyboard with Synergy 1.4.2

    - by Lukasz
    I installed Synergy on Windows 7 as the server and on Windows Vista as the client. On both machines I can type Polish characters with the local keyboard. When I use the remote keyboard via Synergy, Polish characters do not work (I mean, for example, S + AltGr or S + left Alt + left Ctrl). I used Synergy about a year ago and I cannot recall that kind of problem, so I think the problem is only in the current version. Searching through your web site and Google, I found only an identical problem from 5 years ago, with information that it had been solved. Please help me sort it out.

    Read the article

  • USB Ports In Wrong Mode, How To Use usbmodeswitch?

    - by user86872
    I haven't had access to my USB ports as media devices for a couple of days now. I've been reading and researching everything I can find, but I can't find a good guide for usb_modeswitch (usbms) that I can decipher. The USB ports are fine for power, but they no longer support my Android phone as a media device, which is killing me because I use adb every day, and they no longer support my plug-and-play mouse either. I'm not sure what caused the switch, though I think it may be related to the suspend issue I've read about, but the solutions in those threads didn't work either. Below is my system information and details.

    System: Ubuntu 12.04, 64-bit, dedicated machine
    Machine: HP Pavilion g6 notebook, AMD A6 quad-core processor
    USBs used for: cooling dock, Android Debug Bridge, wireless mouse
    Attempted: modprobe, udev restart; unable to attempt lsusb due to my own lack of knowledge. :)

    Last attempt readout:

      ncandiano@ncandiano-HP-Pavilion-g6-Notebook-PC:~$ sudo modprobe -r usbhid && sleep 5 && sudo modprobe usbhid
      ncandiano@ncandiano-HP-Pavilion-g6-Notebook-PC:~$ sudo modprobe -r usb-storage
      ncandiano@ncandiano-HP-Pavilion-g6-Notebook-PC:~$ sudo modprobe usb-storage
      ncandiano@ncandiano-HP-Pavilion-g6-Notebook-PC:~$ sudo restart udev
      udev start/running, process 2624
      ncandiano@ncandiano-HP-Pavilion-g6-Notebook-PC:~$ lsusb
      Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
      Bus 002 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
      Bus 003 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
      Bus 004 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
      Bus 002 Device 002: ID 0461:4de7 Primax Electronics, Ltd webcam

    Any help would be greatly appreciated!
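
    Before reaching for usb_modeswitch, two hedged checks can show whether the devices enumerate at all when plugged in:

        tail -f /var/log/kern.log    # watch kernel messages while re-plugging the phone or mouse
        lsusb -t                     # bus/port tree; shows which driver, if any, claimed each device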

    Read the article

  • Do we have enough time to build an electric car future?

    - by julien.groues
    A recent article from Greenbang has posed the question 'Do we have enough time to build an electric car future?'. The writer discusses that, although the future of transport might lie with electric cars, there is concern regarding whether we'll be able to build the market and infrastructure required to support them before carbon and oil constraints create difficulties in powering the vehicles. Of course, the increasing use of electric vehicles (EVs) is going to put excessive pressure on energy grids, as large volumes of electricity will need to be directed to charging points, which in turn must handle fluctuating demand at peak times. EVs are increasing in popularity as a sustainable method of transport to reduce carbon consumption, and electric utilities will have the opportunity, and the challenge, to quickly determine the best methods to fuel these vehicles and accommodate the associated increases in demand for energy. Critically, efficient software is required to provide diagnostic and predictive capabilities related to EV refuelling - for example, anticipated electricity flow will need to be addressed as the number of EVs on the road increases, and electricity will need to be directed to specific areas on demand as vehicles attempt to recharge en masse. But a smart grid infrastructure can meet these demands, intelligently. The implementation of a smart grid is not in the distant future; it is an achievable reality for utilities via simple installation of new software and technologies, which can be done incrementally for those facing existing legacy systems or concerned with upfront costs. The smart grid is integral to the monitoring and control of energy use as well as the future-proofing of the energy grid. A smart grid will be critical to meeting the electricity requirements of new EVs and will ensure their successful deployment by providing a reliable foundation for the data handling required to record and manage electricity distribution - from recording and assessing energy usage, to analysing data and sharing information with consumers via green billing. http://www.greenbang.com/do-we-have-enough-time-to-build-an-electric-car-future_14248.html

    Read the article

  • How do I tell if my firewire connection is running as 400 or 800?

    - by Tom
    I have a MacBook Pro with FireWire 800 and a Freecom external hard drive that has USB 3.0, FireWire 400 and FireWire 800. I am using a Nikkai FireWire 800 cable that has an 800 connector on one end and a 400 connector on the other. The 800 connector is attached to the MacBook Pro and the 400 connector is attached to the Freecom drive. Is there any way to tell what connection has been established? I looked at Disk Utility and it simply said 'FireWire'. Is there a command-line tool that would give more information? If it's running at 400, I plan to swap the cable for one with 800 connectors at both ends.
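
    On OS X, System Profiler reports the negotiated speed per FireWire device; the same information is available from the command line:

        system_profiler SPFireWireDataType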

    Read the article

  • Routing Essentials

    - by zharvey
    I'm a programmer trying to fill a big hole in my understanding of networking basics. I've been reading a good book (Networking Bible by Sosinsky), but I have found that it contains a lot of "assumed" information, where terms and concepts are thrown at the reader without a proper introduction. I understand that a "route" is a path through a network, but I am struggling to visualize some routing-based concepts. Namely: How do routes actually manifest themselves in the hardware? Are they just a list of IP addresses that gets computed at the network layer and then executed by the transport layer? What kind of data exists in a so-called routing table? Is a routing table just the mechanism for holding these lists of IP addresses (see above)? What are the performance pros and cons of a static route as opposed to a dynamic route?
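
    For a concrete look at what a routing table actually holds on a Linux host (destination prefix, gateway, outgoing device, metric), the kernel's table can be printed directly:

        ip route show          # e.g. "default via 192.168.1.1 dev eth0 metric 100"
        ip route get 8.8.8.8   # shows which entry the kernel would pick for one destination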

    Read the article

  • User accounts in FTP

    - by Brad
    I have an FTP server (proftpd on Debian) that I'm going to allow a couple of friends access to, and I want some safety nets in place, just in case. These are some of the things I'd like to do:

    - Jail the accounts to their home directories and impose a cap on the amount of data they can upload
    - Allow them access to a shared folder (via symlink or something) where they have full access, also with a storage cap, but a larger one
    - Allow my own account full access to the system (using groups, I guess)
    - Not allow anonymous access, or allow it with its own folder, separate from the shared user folder

    Currently, I've got the accounts set up and jailed, but it seems the symlink that I put in is not letting them reach the shared folder. I suppose this has to do with them not having read permissions anywhere but their own home directories, or maybe it's something else; I'll continue to look into it and provide any information that is requested. Is what I'm trying to do possible? Any tips or resources that you can share are appreciated. Thanks.
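
    One detail that may explain the shared-folder symptom: a symlink cannot point outside a chroot jail, so with DefaultRoot ~ in proftpd.conf the usual workaround is a bind mount inside each jailed home (a sketch with hypothetical paths):

        sudo mount --bind /srv/ftp/shared /home/friend1/shared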

    Read the article

  • Oracle Value Chain Summit 2014 - Early Bird Registration Now Open

    - by Pam Petropoulos
    Get the Best Rate on the Biggest Supply Chain Event of the Year. Register Now and save $200. Join more than 1,000 of your peers at the Value Chain Summit to learn how smart companies are transforming their supply chains into information-driven value chains. This unparalleled experience will give you the tools you need to drive innovation and maximize revenue.

    Date: February 3-5, 2014
    Location: San Jose McEnery Convention Center
    Click here to learn more

    Thought-Leading Speakers: Top minds and tech experts across industries will share the secrets of their success, firsthand. Prepare to be inspired by speakers like Geoffrey Moore, business advisor to Cisco, HP, and Microsoft and best-selling author of six books, including Crossing the Chasm.

    Customized Experiences: Choose from more than 200 sessions offering deep dives on every aspect of supply chain management: Product Value Chain, Procurement, Maintenance, Manufacturing, Value Chain Execution, and Value Chain Planning.

    Unrivaled Insight & Solutions: Hands-on workshops, product demonstrations, and interactive breakouts will showcase new value chain solutions and best practices to help you:
    - Grow profit margins
    - Build products faster and cheaper
    - Expedite delivery
    - Increase customer satisfaction

    You don't want to miss this once-a-year event. Register Now to secure the Early Bird rate of $495, the lowest price available.

    Read the article

  • How to forward AIM to Gmail

    - by iamjames
    Still have an old AIM email address lying around and would like to forward it to Gmail? Here's how:

    1. Login to your AIM and click on Settings on the far right.
    2. In the left menu click IMAP and POP.
    3. This shows you your IMAP and POP setup information for AIM. We're going to put this into your Gmail account so your Gmail account will check your AIM account and download all AIM emails.
    4. Login to your Gmail, click Settings and click Accounts and Import.
    5. Click "Import mail and contacts". A new window will pop up asking what account you want to import. Enter your AIM email address and click Continue.
    6. The next page asks for your password. Enter your password and click Continue. Step 2 asks for your import options; I'd put a checkmark in "Leave a copy of retrieved message on server". That way all your mail is still stored on AIM if you ever need it.
    7. Click Start import and you're done. The next screen says it may take several hours, up to 2 days, before you start seeing imported messages, and you can check the status at Settings > Accounts and Import.

    Read the article

  • Code & Slides – SDE – What's new in Silverlight 4

    - by Timmy Kokke
    Last Tuesday the Software Developers Network – SDN organized another SDE. I had the opportunity to present a session about Silverlight 4. I talked about lots of new features in Silverlight 4 and Expression Blend 4, focusing on the Out-Of-Browser features. The slides of my presentation can be downloaded here. In my presentation I demonstrated a couple of features from my new pet project “SilverAmp”. This project is based on the legendary WinAmp, but made entirely in Silverlight. I use it to try out many new Silverlight 4 features. It's not finished yet, but useful already. It runs outside the browser with elevated trust. It reads your local MyMusic folder and uses the TagLib library to read the ID3 tags of the mp3s it finds. SilverAmp uses MEF for extensibility, uses a custom window chrome designed in Expression Design, and shows a notification window when a new song starts to play. In the future it is going to get song information from Last.FM, will be able to show YouTube videos inside itself, and will tweet about what you are playing. The project will be fully documented to function as a reference implementation for the new Silverlight 4 features. You can download the source at http://SilverAmp.codeplex.com Below are two screenshots of SilverAmp. If you have any questions, comments, issues or feature requests let me know.

    Read the article

  • How much can distance and latency (ms) affect download speed?

    - by Prix
    Let's consider A (a client) and B (a server), where A downloads from B. How much can bad routing from A to B affect the download speed? For example, A does a tracert to B and gets a response of 10 hops where the average latency is around 300 ms with 10% packet loss at the 4th hop, whereas when the connection is normal the average from A to B is 10-30 ms. Could this sort of problem reduce A's download speed drastically, or should the speed stay the same as long as both sides and the routes between them have enough capacity for A's full speed from B and vice versa? Besides tracert and ping analysis from A to B, what else can be used to identify the problem? If you need extra information please let me know.
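
    Beyond ping and tracert, mtr is the usual tool for this kind of analysis; it combines both and reports per-hop loss and latency over many cycles (the hostname below is hypothetical):

        mtr --report --report-cycles 100 server-b.example.com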

    Read the article

  • Configure Unity Lenses and what they search

    - by Sindre
    I'm using Ubuntu 12.10. I've read about a lens (ppa:pydave/unity-lenses) that you can use to replace the original Files and Folders lens so you can search all your files, instead of the current one, which only searches recently used files and programs. I couldn't get this to work with 12.10; I got a bunch of errors when I tried adding the PPA. I would like to set up a lens that can search all my files and folders (from all three of my HDDs), one that searches through my videos (with the ability to specify which folders), and the same for music. So basically I would like to set up three specific lenses that each get a set of specified folders to search through. If this is not possible, is there at least a way to configure the current Files and Folders lens to ignore certain folders? I don't like it when my dash shows files that I don't want to be shown. I should add that I'm completely new to Ubuntu, and I apologize beforehand if this information could easily be found, but I wasn't able to find something like this. Edit: I found out how I can use the Privacy application to ignore what I want, so that's sorted now. Sorry for not researching it more. But my question regarding the lenses still stands. All help is greatly appreciated.

    Read the article
