Search Results

Search found 17990 results on 720 pages for 'virtualization option'.

Page 180/720 | < Previous Page | 176 177 178 179 180 181 182 183 184 185 186 187  | Next Page >

  • Dual displays not working - NVidia - Ubuntu 12.04

    - by user75105
    Graphics card: NVIDIA GTX 460. Driver: NVIDIA accelerated graphics driver (version current). I have one DVI monitor, an old Dell LCD from 2005, and one VGA monitor, an Asus ML238H from 2010 whose HDMI port broke. The Asus is plugged into my graphics card's primary monitor slot and is the better monitor even though it is VGA, but my computer defaults to the Dell. This happens when I boot as well; the loading screens, the motherboard brand image, etc. are all displayed on the Dell monitor until Windows loads, and then both monitors work. The same thing happened when I booted Ubuntu 12.04, but I did not see the second monitor when the log-in screen appeared, nor after I logged in. I went to System Settings/Displays and my Asus monitor is not an option; I clicked Detect Displays and the Asus is not detected. I looked at the other questions regarding NVIDIA drivers, recalled my problems with Ubuntu a few years ago, and decided to check the driver. I went to Additional Drivers to install the proprietary driver, and it looks like it is installed and active, but I am still having this problem. There is another driver option, the post-release NVIDIA driver, but that does not fix the problem either. Also, under System Details/Graphics the graphics device is listed as Unknown, which might indicate that it is using a generic open source driver and not the proprietary NVIDIA driver; yet under Additional Drivers it says that I am using the NVIDIA driver. Any help is appreciated.
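
    A note for anyone landing on this with the same symptom: before fighting the display settings, it is worth confirming which driver is actually in use. A minimal sketch, assuming Ubuntu 12.04's standard repositories and package names:

        # show which kernel driver is bound to the GPU
        lspci -k | grep -A 3 -i vga
        # non-empty output here means the proprietary NVIDIA module is loaded
        lsmod | grep nvidia
        # (re)install the proprietary driver and its settings tool, then reboot
        sudo apt-get install nvidia-current nvidia-settings
        # configure both displays; "Save to X Configuration File" writes /etc/X11/xorg.conf
        sudo nvidia-settings

    If lsmod shows no nvidia module even though Additional Drivers reports the driver as active, the open source nouveau driver may still be in charge, which would also explain the "Unknown" graphics device.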

    Read the article

  • Options to efficiently synchronize 1 million files with remote servers?

    - by Zilvinas
    At the company I work for we have things called "playlists", which are small files of roughly 100-300 bytes each. There are about a million of them, and about 100,000 of them change every hour. These playlists need to be uploaded to 10 other remote servers on different continents every hour, and it needs to happen quickly, ideally in under 2 minutes. It is very important that files deleted on the master are also deleted on all the replicas. We currently use Linux for our infrastructure. I was thinking about trying rsync with the -W option to copy whole files without comparing contents. I haven't tried it yet, but maybe people who have more experience with rsync could tell me whether it is a viable option? What other options are worth considering?
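
    For reference, a minimal sketch of what the rsync approach might look like; the paths, the sync user and the servers.txt host list are placeholders, not part of the original question:

        # push the playlist tree to every replica in parallel; -a preserves attributes,
        # -W sends whole files (skips the delta algorithm), --delete removes files
        # that were deleted on the master
        while read -r host; do
            rsync -aW --delete /srv/playlists/ "sync@${host}:/srv/playlists/" &
        done < servers.txt
        wait    # block until all 10 transfers have finished

    Whether this fits inside the 2-minute window with ~100,000 changed files per run is something only a test against the real replicas can show; tools built for fan-out replication (lsyncd, or a queue-based push) are the options usually weighed against plain rsync.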

    Read the article

  • Can't access computer

    - by Pudica
    I'm running Ubuntu 14.04 on an Intel NUC and it won't boot! The last successful boot was earlier today, but now each time I try it gets stuck on the GRUB menu where it prompts for a memory check etc. This is not a dual-boot system, so this screen shouldn't ever appear, and it never has before. It's GRUB version 2.02~beta2-9, which is a little disconcerting, as I'm on the stable sources. Unfortunately the keyboard (I've tried 2 keyboards just in case) is not responding at this point in the boot process, so I can't select the "Ubuntu" menu option in GRUB. The keyboard works during the BIOS stage, so I can configure it to boot from USB, and I tried a flash drive with 14.04 on it. The flash drive works in my laptop but is completely ignored by the NUC (I tried all 3 USB ports!). It seems that I have no way of getting into the machine at all! The Intel support site was my first option, but the site is down. I expect it's a long shot, but if anyone has any ideas I'd be very grateful.

    Read the article

  • Beyond Cloud Technology, Enabling A More Agile and Responsive Organization

    - by sxkumar
    This is the second part of the blog "Clouds, Clouds Everywhere But not a Drop of Rain". In the first part I shared how a broad-based transformation makes cloud more than a technology initiative; in this section I will describe how it requires people (organizational) and process changes as well, and how these changes are as critical as the choice of the right tools and technology.

    People: Most IT organizations have a fairly complex organizational structure. There are different groups managing different pieces of the puzzle, and yet they don't always work together. Provisioning a new application may therefore require a request to float endlessly through the system administrator, DBA and middleware admin worlds, resulting in long delays and constant finger pointing. Cloud users expect end-to-end automation, which requires these silos to be greatly simplified, if not completely eliminated. Most customers I talk to acknowledge this problem but are quick to admit that such a transformation is hard. As hard as it may be, I am afraid that the status quo is no longer an option. Sticking to an organizational structure that was created ages ago will not only impede cloud adoption, it also risks making IT skills increasingly irrelevant in a world that is rapidly moving towards converged applications and infrastructure.

    Process: Most IT organizations today operate with a mindset that they must fully "control" access to any and all types of IT services. This in turn leads to people clinging to outdated manual approval processes. While requiring approvals for scarce resources makes sense, insisting that every single request be manually approved defeats the very purpose of cloud. Not only does this cause delays, at least partially negating the agility benefits, it also results in gross inefficiency. In a cloud environment, self-service access should be governed by policies and quotas that administrators can define up front. For a cloud initiative to be successful, IT organizations MUST be ready to empower users by giving them real control rather than insisting on brokering every single interaction between users and the cloud resources.

    Technology: From a technology perspective, cloud is about consolidation, standardization and automation. A consolidated and standardized infrastructure helps increase utilization and reduce cost. Additionally, it enables a much higher degree of automation, thereby providing users the required agility while minimizing operational costs. Obviously, automation is the key to cloud. Unfortunately it hasn't received as much attention within enterprises as it should have. Many organizations are just now waking up to the criticality of automation, and it still often gets relegated to the back burner in favor of other "high priority" projects. However, it is important to understand that without the right type and level of automation, cloud will remain a distant dream for most enterprises. This in turn makes the choice of cloud management software extremely critical. For cloud management software to be effective in an enterprise environment, it must meet the following qualifications:

    Broad and Deep Solution: It should offer a broad and deep solution to enable the kind of broad-based transformation we are talking about. Its footprint must cover physical and virtual systems, as well as the infrastructure, database and application tiers. Too many enterprises choose to equate cloud with virtualization. While virtualization is a critical component of a cloud solution, it is just a component and not the whole solution. Similarly, too many people tend to equate cloud with Infrastructure-as-a-Service (IaaS). While it is perfectly reasonable to treat IaaS as a starting point, it is important to realize that it is just the first stepping stone, and on its own it can only provide limited business benefits. It is actually the higher-level services, such as (application) platform and business applications, that will bring about a more meaningful transformation for your enterprise.

    Run and Manage Your Mission Critical Applications Efficiently: It should not only be able to run your mission critical applications, it should do so better than before. For enterprises, applications and data are the critical business assets. As such, if you are building a cloud platform that cannot run your ERP application, it isn't truly an "enterprise cloud". Also, be wary of vendors who try to sell you the idea that your applications must be written in a certain way to be able to run on the cloud. That is nothing but a bogus, self-serving argument. For the cloud to be meaningful to enterprises, it should adapt to your applications, and not the other way around.

    Automated, Integrated Set of Cloud Management Capabilities: At the root of many of the problems plaguing enterprise IT today is complexity. A complex maze of tools and technology, coupled with archaic processes, results in an environment which is inflexible, inefficient and simply too hard to manage. Management tool consolidation is therefore key to the success of your cloud, as tool proliferation adds to complexity, encourages compartmentalization and defeats the very purpose you are building the cloud for. Decision makers ought to be extra cautious about vendors trying to sell them a "suite" of disparate and loosely integrated products as a cloud solution. An effective enterprise cloud management solution needs to provide a tightly integrated set of capabilities for all aspects of cloud lifecycle management. A simple question to ask: will your environment be more or less complex after you implement your cloud? More often than not, the answer will surprise you.

    At Oracle, we have understood these challenges and have been working hard to create cloud solutions that are relevant and meaningful for enterprises. And we have been doing it for much longer than you may think. Oracle was one of the very first enterprise software companies to make its products available on the Amazon Cloud. As far back as 2007, we created new cloud solutions such as Cloud Database Backup that are helping customers like Amazon save millions every year. Our cloud solution portfolio is also the broadest and deepest in the industry, covering public, private and hybrid clouds across the infrastructure, platform and applications layers. It is no coincidence, therefore, that the Oracle Cloud today offers the most comprehensive set of public cloud services in the industry. To a large part, this has been made possible thanks to our years of investment in creating cloud-enabling technologies. I will dedicate the third and final part of the blog "Clouds, Clouds Everywhere But not a Drop of Rain" to Oracle Cloud technology building blocks and how they map into our vision of the enterprise cloud. Stay tuned.

    Read the article

  • NFS mount of /var/www to OS X

    - by ploughguy
    I have spent 2 hours trying to create an NFS mount from my Ubuntu 10.04 LTS server to my OS X desktop system. Objective: a three-way file compare between the code base on the Mac, the development copy on the local Linux test system, and the hosted website. The hosted service uses cPanel, so I can mount a web disk: easy as pie, took 10 seconds. The local Ubuntu box, on the other hand: nothing but pain and frustration. Here is what I have tried: (1) In File Browser, navigate to /var/www/site and right-click. Select "Share this folder". Enter the share name wwwsite and a comment. Click the "Create Share" button. A message says you can only share file systems you own. There is a message on how to fix this, but the killer is that this shares over SMB, which will change the LFs to CR-LFs and affect the file comparison. So forget this option. (2) In a terminal window, run shares-admin (I have not been able to convince it to give me the "Shared Folders" option in the System Administration window; maybe it is somewhere else in the menu, but I cannot find it) and define an NFS export: enter the path /var/www/site, select NFS, enter the IP address of the iMac and save. (3) On the Mac, try to mount the file system using the usual methods (Finder, the command-line mount command): not found. Nothing. Tried restarting the Linux box in case there is a daemon that needs restarting: nothing. So I have run out of stuff to do. I have tried searching the documentation; it is pretty basic, and the man page documentation is as opaque as ever. Please, oh please, will someone help me get this @38&@^# thing to work! Thanks for reading this far... PG.
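
    For anyone comparing notes, a rough sketch of doing the export by hand instead of through shares-admin; the IP addresses (192.168.1.10 for the Ubuntu server, 192.168.1.50 for the Mac) and the mount point are assumptions:

        # on the Ubuntu 10.04 server
        sudo apt-get install nfs-kernel-server
        # 'insecure' lets OS X connect from a non-reserved port; alternatively pass
        # -o resvport on the Mac side and drop 'insecure' here
        echo "/var/www/site 192.168.1.50(ro,insecure,no_subtree_check)" | sudo tee -a /etc/exports
        sudo exportfs -ra

        # on the Mac (192.168.1.10 is the Ubuntu server's address here)
        sudo mkdir -p /Volumes/site
        sudo mount -t nfs 192.168.1.10:/var/www/site /Volumes/site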

    Read the article

  • How do I re-apply compression options?

    - by Gooberpatrol66
    Shortly after I got my computer, I enabled NTFS compression on it, selecting the option to compress "all files and subfolders". I recently noticed that several folders on my PC are not compressed anymore, including "Program Files" and "Windows". I suspect this happened when I installed Windows 8.1. The problem is, the only way I can think of to fix this would be to uncheck the tick box under "Properties" for my drive, thus decompressing everything on my drive, and then re-check it with the "all files and subfolders" option. Is there a way to compress all the uncompressed folders without first decompressing the compressed folders?
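
    For what it's worth, a hedged sketch of one way to re-apply compression without touching the drive-level tick box; compact only acts on files that are not already compressed, so compressed folders are left alone (run from an elevated Command Prompt, and expect in-use system files to be skipped):

        REM /c = compress, /s = include subfolders, /i = continue past errors such as locked files
        compact /c /s:C:\ /i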

    Read the article

  • Empty MacBook Pro, no SuperDrive. How do I install Windows?

    - by MCcz
    My situation is this:
    HDD1: empty 180 GB SSD
    HDD2: empty 500 GB HDD (installed in place of the SuperDrive)
    Accessories: Windows 8 ISO, 64 GB USB stick, a second computer, and the SuperDrive in a USB enclosure.
    What I need: install Windows 8 on the SSD in the laptop.
    What I tried: creating a bootable USB stick doesn't work; the MacBook doesn't show the USB stick as an option after holding the Option key. Installing Windows from the SuperDrive connected via USB doesn't work either. On the internet there are thousands of articles offering all kinds of solutions, but they all expect me to have Mac OS on my laptop. Is there any solution to this?

    Read the article

  • Software for Company internal Website [closed]

    - by LordT
    hope this is the right stackexchange site to ask this: We have a group of web pages/services at work (an SE startup), ranging from SVN, Trac and continuous integration to link collections and a DMS. Nearly everything has an RSS feed to get the info I need, with the exception of SVN. I'm looking for some kind of software that can integrate these well on a kind of start page. The most recent changes, upcoming events etc. should be clearly visible, as well as an option to search (the search itself will be provided by a different tool). A news area should be included as well. Currently I'm pondering doing this with either WordPress or TWiki, although WordPress seems to be the simpler way to get something good-looking quickly. Authentication should be handled by HTTP Basic Auth, which we already have in place and working well. I would normally consider SharePoint a viable option for this, but we're exclusively Mac and Linux, and I won't put up a Windows server just for this.

    Read the article

  • What is the secure way to isolate ftp server users on unix?

    - by djs
    I've read documentation for various ftp daemons and various long threads about the security implications of using a chroot environment for an ftp server when giving users write access. If you read the vsftpd documentation in particular, it implies that using chroot_local_user is a security hazard, while not using it is not. There seems to be no coverage of the implications of allowing the user access to the entire filesystem (as permitted by their user and group membership), nor of the confusion this can create. So, I'd like to understand what the correct method is to use in practice. Should an ftp server with authenticated write-access users provide a non-chroot environment, a chroot environment, or some other option? Given that Windows ftp daemons don't have the option to use chroot, they must implement isolation in some other way. Do any Unix ftp daemons do something similar?
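
    For context, a sketch of the vsftpd directives usually discussed in those threads; the local_root layout is an assumption, and allow_writeable_chroot is only recognized by newer vsftpd builds:

        # jail each local user into a directory instead of exposing the whole filesystem
        chroot_local_user=YES
        # give each user a writable subdirectory while keeping the chroot root itself
        # non-writable (vsftpd refuses a writable chroot root by default)
        user_sub_token=$USER
        local_root=/home/$USER/ftp
        # alternatively, on newer builds, explicitly accept a writable chroot root:
        # allow_writeable_chroot=YES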

    Read the article

  • Where does Windows 8 store the data for things I recently changed?

    - by user1769787
    Today when I started my computer it did not boot successfully and showed me something like a Restore option. I clicked it, and it rolled back many of the things I had done recently. For example: software I uninstalled in the last few days has come back; I can see it in the Add or Remove Programs list and it works again (how can software I have uninstalled come back?). Email addresses I had removed are back. My Firefox (latest beta) was downgraded, and updates I installed in the last few days have also been rolled back. So where does Windows store all the things it rolls back? If it can reinstall software that I had already uninstalled, it must have stored it somewhere. Where does Windows 8 keep these files that it restored?

    Read the article

  • "Permission denied (publickey)" error when ssh'ing to Amazon EC2 Debian AMI 05f3ef71

    - by user193537
    I have launched a Debian system using AMI 05f3ef71 in Amazon EC2, but I have no luck connecting to it using SSH as suggested in "Connecting to Your Linux/UNIX Instances Using SSH". I tried several user names: ec2-user, root, debian... None of them worked. I always get a "Permission denied (publickey)" error message. Using ec2-get-console-output instance_id as suggested doesn't work either; it requires the option "-K", and if I supply it, I get the error message "Required option '-C, --cert CERT' missing", but I have no idea what to supply there. Port 22 is open on the affected instance. Does anyone have an idea what I could try to log in to my instance?
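
    For reference, a sketch of the checks that usually resolve this; the key file name and hostname are placeholders, and "admin" is the user name the official Debian AMIs typically use:

        # the key pair chosen at launch must match this .pem, and it must not be world-readable
        chmod 400 mykey.pem
        # -v prints the keys being offered, which shows whether the key is even being tried
        ssh -v -i mykey.pem admin@ec2-xx-xx-xx-xx.compute-1.amazonaws.com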

    Read the article

  • Dual boot: Windows 7 partition deleted after Kubuntu 14.04 install... Weird!

    - by user292152
    I bought two new SSDs in order to install Kubuntu on one and Win 7 on the other. Before, I had Linux Mint and Win 7 together on just one SSD. So first I installed Win 7 as recommended, and then used the guided installer to install Kubuntu. I selected the second SSD and chose the option "use entire disk and install", but to my surprise, after rebooting and selecting the Windows 7 boot loader from GRUB 2, I got a prompt that my Windows installation is damaged and I need to run the repair option from the installation disk. So I booted into Kubuntu again, fired up kparted and saw that my Windows partition had indeed been deleted, except for the recovery partition. I don't understand what happened. I am not new to this topic, and this was not my first time installing Ubuntu alongside Windows; I have never had this problem before. What can I do to make sure this won't happen again, so I won't waste another 2 hours of my life? Thanks a lot!

    Read the article

  • How to configure OpenVPN server to use custom default gateway?

    - by Arenim
    I have a VPN server at address 10.1.0.2, and the server has another IP in its subnet, 10.0.0.2 (it is a tun2socks router). The server's default gateway is NOT 10.0.0.2 (and that is fine) but another, external IP. I want all the clients' traffic to be forwarded through the 10.0.0.2 address. Here is part of my server's config:

        dev tap0
        server-bridge 10.1.0.1 255.255.255.0 10.1.0.50 10.1.0.100
        push "route 10.0.0.0 255.255.255.0"   ; now clients can ping 10.0.0.2
        push "redirect-gateway def1 bypass-dhcp"
        push "dhcp-option DNS 10.1.0.1"
        push "dhcp-option WINS 10.1.0.1"

    In fact I want something like push "redirect-gateway 10.0.0.2". How can I achieve this?
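
    Two hedged sketches of how this is commonly approached; route-gateway is a standard OpenVPN directive, but pushing it only helps if 10.0.0.2 is directly reachable from the clients' bridged segment, so the policy-routing variant on the server is the usual fallback:

        # option 1: in server.conf, replace the plain redirect-gateway push with
        push "route-gateway 10.0.0.2"
        push "redirect-gateway def1 bypass-dhcp"

        # option 2: keep the existing pushes and steer the clients' traffic on the server
        sysctl -w net.ipv4.ip_forward=1
        ip rule add from 10.1.0.0/24 table 100
        ip route add default via 10.0.0.2 table 100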

    Read the article

  • Hidden Periodic Screenshots on a corporate workstation?

    - by ssxuser80
    Can anyone recommend something that allows us to take hidden periodic screenshots of a workstation? We have a user who we believe is abusing his computer privileges. We have our suspicions that he may be playing games, etc. We need to monitor his screen without him being aware of it. Currently, the IT Department here is using Dameware Mini Remote Control to view his login sessions. But there isn't an option to set up automatic periodic screenshots. I'm hoping to find a tool that has this option and can be centrally managed as well. Thank you for your time. Any help would be greatly appreciated.

    Read the article

  • Sharepoint 2007: Edit vs Read Only Mode

    - by user29116
    Sorry about the title, I don't really know what it should be. If I open a doc in read-only mode, I am able to press Save, and it opens a Save As box whose default directory is the directory on the SharePoint server; if you press Save, you save the file to the server. This makes the whole process not really "read only", since I could actually update the document. Is there a way to prevent this from happening, so that if someone chooses read-only there is no way to upload any changes back to the SharePoint site? Also, it has been suggested as a solution to get rid of the edit/read-only prompt so that people have to check out the document. Is there a way to remove the edit/read-only option on documents?

    Read the article

  • Is there a gotcha with DBCC CHECKDB ('DBNAME', NOINDEX)?

    - by Deb Anderson
    I am turning on DBCC CHECKDB in our OLTP environment (SQL 2005, 2008). System overhead is a very visible thing on our servers, so I want the checks to be as efficient as it makes sense for them to be. Hence I want to turn on the NOINDEX option, an option I've never used before. My thoughts are these: if a problem with an index is detected outside the integrity check, I can just rebuild that index. Also, the duration of the integrity checks will be drastically reduced, and the nastier corruption will still be detected. What is the flaw in my plan? Thanks, Deb
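
    For reference, a sketch of what the check and the follow-up repair would look like; the database, table and index names are placeholders. The main trade-off is that NOINDEX skips the checks of nonclustered indexes on user tables, so corruption confined to those indexes will not be reported until something else trips over it:

        -- skip nonclustered index checks to cut duration and overhead
        DBCC CHECKDB ('DBNAME', NOINDEX) WITH NO_INFOMSGS, ALL_ERRORMSGS;
        -- a suspect nonclustered index found later can simply be rebuilt
        ALTER INDEX IX_SomeIndex ON dbo.SomeTable REBUILD;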

    Read the article

  • Problem connecting Reliance Broadband+

    - by Athira R
    I tried connecting Reliance Broadband+ in Ubuntu. An autorun prompt comes up, but when I click Install, another pop-up appears saying the autorun file was not found. And when I check under Internet connections to add a connection, the Reliance Broadband option itself is not listed as a connected device. This is Athira. I could not find any "add comment" option on my main question, so I created a new account and am adding this as a new answer to your question. I got this after running lsusb in Ubuntu 10.04:

        athira@athira-laptop:~$ sudo lsusb
        Bus 007 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 006 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 005 Device 004: ID 12d1:1505 Huawei Technologies Co., Ltd.
        Bus 005 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 004 Device 002: ID 08ff:1600 AuthenTec, Inc. AES1600
        Bus 004 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 003 Device 002: ID 0421:0508 Nokia Mobile Phones E65 (PC Suite mode)
        Bus 003 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 002 Device 002: ID 064e:a103 Suyin Corp.
        Bus 002 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
        Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub

    Read the article

  • EC2 Image to start

    - by HD.
    I'm starting to test EC2 for a couple of new projects. I need to choose an AMI (Amazon Machine Image), and Amazon offered me Fedora Core 8 as the first option, which is a very old version of one of my favorite distributions. There are a lot of choices, but it's not clear to me which one is the best option. I have my own criteria for choosing a distro and a version when I need to install a new server, but I don't know if I can apply the same to EC2. I know there is a beta for RHEL; how stable is this beta? How can I choose between all the CentOS AMIs in the list? So this is my question: do you recommend an AMI to start with on EC2? Thanks

    Read the article

  • New functionality in TFS Build Manager – Managing Triggers and Build Resources

    - by Jakob Ehn
    Yesterday we pushed out a new release (August 2012) of the Community TFS Build Extension, including a new version of the Community TFS Build Manager (1.0.4.6). The two big new features in the Build Manager in this release are:

    Set Triggers: It is now possible to select one or more build definitions and update the triggers for them in one simple operation. You'll note that we have started collapsing the context menu a bit; the list of commands is getting long! When selecting the Trigger command, you'll see a dialog where the options should be self-explanatory. The only thing missing here is the Scheduled trigger option; you'll have to do that using Team Explorer for now.

    Manage Build Resources: The other feature is that it is now possible to view the build controllers and agents in your current collection and also perform some actions against them. The new functionality is available by selecting the Build Resources item in the drop-down menu. Selecting this, you'll see a (sort of) hierarchical view of the build controllers and their agents. In this view you can quickly see all the resources and their status. You can also view the build directory of each build agent and the tags that are associated with it. On the action menu, you can enable and disable both agents and controllers (several at a time), and you can also select to remove them. By selecting Manage, you'll be presented with the standard Manage Controller dialog from Visual Studio, where you can set the rest of the properties. Hopefully we'll be able to implement most of the existing functionality so that we can remove that menu option. Our plan is to add more functionality to this view, such as adding new agents/controllers, restarting build service hosts, and maybe viewing diagnostic information such as disk space and error logs.

    Hope you'll find the new functionality useful. Remember to log any bugs and feature requests on the CodePlex site. Happy building!

    Read the article

  • How to stop any dialog windows from showing when inserting a USB drive in Windows?

    - by jasondavis
    1) I just found a really interesting program that allows me to use a USB drive as a Windows login key. It is called Rohos Logon Key. If I remove my USB drive/key from the PC, I can have the PC lock or hibernate or any other option. I have been looking for such a solution for many years but never knew one existed until now, and it works much better than I imagined. I do have a couple of minor issues though (I'm using Windows 7 Pro). When I remove and then re-insert my USB key, Windows prompts me with the "scan and fix" dialog. Generally I just click "Continue without scanning", but I am looking for a way to make it not show this at all. Is it possible to disable it? 2) I also get a second dialog when I insert USB drives/keys. Would it be possible to not show this as well, or have it pick an option by default, or anything really?
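
    For reference, a hedged sketch of two mitigations from an elevated Command Prompt; the registry value disables the AutoPlay prompt for all drive types for the current user, while stopping the Shell Hardware Detection service is the heavier option commonly suggested for the "scan and fix" prompt (it also kills AutoPlay entirely):

        REM disable AutoPlay for all drive types (0xFF = 255) for the current user
        reg add "HKCU\Software\Microsoft\Windows\CurrentVersion\Policies\Explorer" /v NoDriveTypeAutoRun /t REG_DWORD /d 255 /f
        REM heavier hammer: stop and disable the Shell Hardware Detection service
        sc stop ShellHWDetection
        sc config ShellHWDetection start= disabled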

    Read the article

  • Cronjob terminates early

    - by TheBigO
    In my crontab file I execute a script like so (I edit the crontab using sudo crontab -e):

        01 * * * * bash /etc/m/start.sh

    The script runs some other scripts like so:

        sudo bash -c "/etc/m/abc.sh --option=1" &
        sleep 2
        sudo bash -c "/etc/m/abc.sh --option=2" &

    When cron runs start.sh, I do ps aux | grep abc.sh and I see the abc.sh script running. After a couple of seconds, the script is no longer running, even though abc.sh should take hours to finish. If I run sudo bash /etc/m/start.sh & from the command line, everything works fine (the abc.sh scripts run for hours in the background until they complete). How do I debug this? Is there something I'm doing that is preventing these scripts from running in the background until they are done?
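
    A debugging sketch rather than a diagnosis; the log paths are placeholders. Capturing the job's output and detaching the children from the cron session usually shows (or removes) whatever is ending them early:

        # in the crontab: log everything start.sh and its children print
        01 * * * * /bin/bash /etc/m/start.sh >> /var/log/start-cron.log 2>&1
        # inside start.sh: ignore HUP and give each child its own log
        nohup sudo bash -c "/etc/m/abc.sh --option=1" >> /var/log/abc1.log 2>&1 &
        sleep 2
        nohup sudo bash -c "/etc/m/abc.sh --option=2" >> /var/log/abc2.log 2>&1 &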

    Read the article

  • Help trying/downloading ubuntu

    - by koolomwee
    I can't try Ubuntu. Every time I put in my disc and press Try, it shows the Ubuntu logo with the dots under it, then it shows a text console that basically blinks messages saying that it closed and cancelled applications belonging to Ubuntu... Then I tried to install it from the disc: same thing. Then, because I have Windows, I tried installing it from the website (the Wubi installer) and that worked, but when I rebooted my computer and chose the Ubuntu option I just got a blinking text console again... How do I get Ubuntu to work? What happens (with the text console) is that it shows messages that blink too fast for me to read, and then it stops after a minute or two. The only thing I do get to read is that Ubuntu commanded some applications to shut down. With the CD there are only about 5 error messages; when I reboot and select Ubuntu, there are over a hundred. With the live CD I have to shut down my computer to use it again; with the reboot-and-select-Ubuntu option it reboots by itself. I expect it to actually start up with no error messages. UPDATE: I've been trying to start Ubuntu for the last couple of days, and on my last try I noticed it said "SIGNAL 15 RECEIVED"... It also said that it was stopping Bluetooth and all other programs, and that it was rebooting. Maybe that will help a little with answering my question... thanks :) This might also help: computer brand/model: HP HPE240f, Windows 7 Home Premium Service Pack 1; graphics card: ATI Radeon HD 5570. I also wrote the same question here: https://answers.launchpad.net/wubi/+question/194537 - it might have a little more information.

    Read the article

  • Windows Recovery partition unusable after Ubuntu 12.04 install on Eee PC 1005P

    - by Crivat Camilar
    Installed Ubuntu 12.04 over the secondary (D:) partition with GRUB 2 handling multi-boot. I never accessed the 'Recovery' option in the boot menu until Windows 7 Starter became unusable due to HDD failure (bad sectors on C:). I tried creating a USB recovery stick using the OEM's recovery application (F9) on the hidden partition: all I got was a clean C:\ and an error telling me the recovery images cannot be found [R:\recovery\windowsre\ - or something very much like that], although everything is there (I changed the 'hidden' flag to check and copy the contents). Nothing happens upon pressing F9; then GRUB takes over, giving the recovery option. The application starts but halts about 30 seconds after initializing, very briefly displaying the error message above. I guess every time it goes through this it actually wipes C:\ but crashes immediately afterwards, not being able to find whatever .wim image files it needs. How do I make it work?

    Read the article

  • There's Not an App for That (Yet)

    - by Mark Hesse
    With an earlier-than-normal departure this morning to avoid the stalemate known as traffic congestion, I suddenly realized what I had failed to grab on my way out the door... my company ID badge. Unfortunately, at the time of my epiphany, I was far enough into commuter no-man's land that turning back would completely negate my early departure and increase my overall drive time exponentially. Not being one to retrace my steps, I decided to press on. Upon arrival at the office, with an hour to go before a security guard would be on duty, I started thinking about the number of times I had forgotten my ID vs. the number of times I had forgotten my phone. While rare in both cases, my ID was most likely the missing artifact. I then wondered why there isn't an app for my smartphone that allows me to verify my credentials with my employer and then, provided with a secure token for the day, gives me the ability to access my building's card entry system. On many levels, this seems much more secure than an ID card, which can be lost, stolen or even forged and then used simply by tailgating into and around buildings at facilities where card scanning can generally be avoided. As it turns out, another building on the campus has 24x7 guard coverage, so I was able to gain access in a relatively short time and secure a temporary ID badge. Once inside and online, a quick internet search on the subject of smartphone badge access showed that efforts are underway to do exactly what I was thinking needed to be done. Having not spent any time studying the technology, I discovered that it relies on Near Field Communication (NFC) enabled smartphones (which mine is not). The only other option would require modifications to the security infrastructure to support alternative authentication technologies, such as barcode readers, which would be extremely costly to implement. For now, my best option is to put my corporate ID under my car keys...

    Read the article

  • Recommend web file sharing software, please

    - by Baczek
    I'm looking for a web platform to put company files on. My requirements are:
    - should be accessible via a browser
    - should be open source
    - must be installable (Dropbox is a no-go)
    - must have an option to put an access-time limit on a file
    - must perform garbage collection automatically after a file expires
    - must be able to mark files as public or private
    - an option to protect a file via a PIN code for users without accounts in the system would be nice to have
    The problem is I don't even know what to search for; all my googling turns up either complete groupware solutions or P2P file-sharing software. If such a thing doesn't exist, please don't hesitate to say so, so I can crawl to a corner and cry myself to sleep. TIA

    Read the article
