Search Results

Search found 36019 results on 1441 pages for 'access 2003'.


  • copy & paste in VirtualBox remote console when running headless

    - by katsumii
    One can run a VirtualBox guest in "headless" mode and access it with a Remote Desktop Protocol (RDP) client. This is typical when the VBox server is installed on Linux/Solaris without any X-window stack and users access the VM from Windows. So one can install an OS into VBox guests using a Remote Desktop client (e.g. mstsc.exe). Here's the setting. One lesser-known feature is that you can copy & paste into and out of the VM guest from your client. Apparently "VMware Workstation" still doesn't support it; its Documentation Center states: "You cannot copy and paste text between the host system and the guest operating system."
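    The commands involved are roughly the following (a sketch; the VM name "winxp" is just an example):

        # enable the built-in remote display (RDP) server for the VM
        VBoxManage modifyvm "winxp" --vrde on
        # start the guest with no local window
        VBoxManage startvm "winxp" --type headless
        # then connect from the client, e.g. mstsc.exe <host>:3389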

    Read the article

  • Is it possible to get a Proxy Authentication Dialog with Ubuntu Server?

    - by Johnny Bigoode
    I've got a VirtualBox VM with Ubuntu Server. I set the http_proxy variable using export http_proxy="http://1234:linux@proxy:8080" The problem is that Ubuntu will constantly try to connect to the internet, even when I'm not logged in to my company's account, so every day I need to reset my password since Ubuntu will constantly try to access the internet. Also, it's always a problem when I need to authenticate to the proxy with a different user/password. Can't I just set it to show a small prompt when it tries to connect to the proxy and fails? Like Firefox, Chrome and every app I have installed on Windows 7? I get a small dialog box that asks for a username and password when it can't access the internet. The Ubuntu Server doesn't need a constant internet connection, especially since it's only online for tests over the LAN.
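    There is no built-in dialog for this on a server install, but a small sourced script can at least prompt for the credentials instead of hard-coding them. This is only a sketch, and the proxy host/port are placeholders:

        #!/bin/bash
        # hypothetical proxyauth.sh -- run as:  . ./proxyauth.sh
        read -p "Proxy user: " PUSER
        read -s -p "Proxy password: " PPASS; echo
        export http_proxy="http://$PUSER:$PPASS@proxy:8080"
        export https_proxy="$http_proxy"

    Note the script must be sourced (not executed) so the exported variables survive in the calling shell.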

    Read the article

  • UK Oracle User Group Event: Trends in Identity Management

    - by B Shashikumar
    As threat levels rise and new technologies such as cloud and mobile computing gain widespread acceptance, security is occupying more and more mindshare among IT executives. To help prepare for the rapidly changing security landscape, the Oracle UK User Group community and our partners at Enline/SENA have put together a User Group event in London on Apr 19 where you can learn more from your industry peers about upcoming trends in identity management. Here are some of the key trends in identity management and security that we predicted at the beginning of last year, and a look at how they have turned out so far. You have to admit that we have a pretty good track record when it comes to forecasting trends in identity management and security.
    Threat levels will grow—and there will be more serious breaches: We have since witnessed breaches of high-value targets like RSA and Epsilon. Most organizations have not done enough to protect against insider threats. Organizations need to look for security solutions that stop user access to applications based on real-time patterns of fraud and that handle situations in which employees change roles or employment status within a company.
    Cloud computing will continue to grow—and require new security solutions: Cloud computing has since exploded into a dominant secular trend in the industry. Cloud computing continues to present many opportunities, like low upfront costs and rapid deployment. But cloud computing also increases policy fragmentation and reduces visibility and control. So organizations require solutions that bridge the security gap between the enterprise and cloud applications to reduce fragmentation and increase control.
    Mobile devices will challenge traditional security solutions: Since that time, we have witnessed a proliferation of mobile devices. Combined with increasing numbers of employees bringing their own devices to work (BYOD), these trends continue to dissolve the traditional boundaries of the enterprise. This, in turn, requires a holistic approach within an organization that combines strong authentication and fraud protection, externalization of entitlements, and centralized management across multiple applications—and open standards to make all that possible.
    Security platforms will continue to converge: As organizations move increasingly toward vendor consolidation, security solutions are also evolving. Next-generation identity management platforms have best-of-breed features, and must also remain open and flexible to remain viable. As a result, developers need products such as the Oracle Access Management Suite in order to efficiently and reliably build identity and access management into applications—without requiring security experts.
    Organizations will increasingly pursue "business-centric compliance": Privacy and security regulations have continued to increase, so businesses are increasingly looking for solutions that combine strong security and compliance management tools with a business-ready experience for faster, lower-cost implementations.
    If you'd like to hear more about the top trends in identity management and learn how to empower yourself, then join us for the Oracle UK User Group on Thu Apr 19 in London where Oracle and Enline/SENA product experts will come together to share security trends, best practices, and solutions for your business. Register Here.

    Read the article

  • Unable to Sign in to the Microsoft Online Services Signin application from Windows 7 client located behind ISA firewall

    - by Ravindra Pamidi
    A while ago I helped a customer troubleshoot an authentication problem with the Microsoft Online Services Sign In application. This customer was evaluating Microsoft BPOS (Business Productivity Online Services) and was having trouble using the single sign-on application behind an ISA 2004 firewall. The network structure is fairly simple, with a single Windows 2003 Active Directory domain and Windows 7 clients. On a successful logon to the Microsoft Online Services Sign In application, this application provides single sign-on functionality to all of the Microsoft online services in the BPOS package.
    Symptoms: When trying to sign in, it fails with the error "The service is currently unavailable. Please try again later. If problems continue, contact your service administrator". If the ISA 2004 firewall is removed from the picture, the authentication succeeds.
    Troubleshooting: Enabled ISA Server firewall logging along with the Microsoft Network Monitor tool on the Windows 7 client while reproducing the issue. Analysis of the ISA Server firewall logs and the network capture revealed that the Microsoft Online Services Sign In application, when sending its request to the ISA Server, does not send the domain credentials, and as a result the ISA Server responds with HTTP 407 Proxy authentication required, listing the supported authentication mechanisms. The application is expected to send the credentials of the domain user in response to this request; however, in this case it fails to send the logged-on user's domain credentials. A bit of research on the Internet revealed that the "Microsoft Online Services Sign In" application by default does not support outbound Internet proxy authentication. In order for it to send the logged-on user's domain credentials we had to make changes to its configuration file "SignIn.exe.config", located under the "Program Files\Microsoft Online Services\Sign In" folder. Step-by-step details for the configuration file are documented on the Microsoft TechNet website: Configure your outbound authenticating proxy server - http://www.microsoft.com/online/help/en-us/helphowto/cc54100d-d149-45a9-8e96-f248ecb1b596.htm
    After the above problem was addressed we were still not able to use the "Microsoft Online Services Sign In" application, and it failed with the same error. Analysis of another network capture revealed that the application was now sending the required credentials, but the connection terminated at a later stage. Enabled verbose logging for the "Microsoft Online Services Sign In" application and reproduced the problem. Analysis of the logs revealed a time difference between the local client and the Microsoft Online Services server of around seven minutes, which is above the acceptable time skew of five minutes. Excerpt from the Microsoft Online Services Sign In application verbose log:
    1/26/2012 1:57:51 PM Verbose SingleSignOn.GetSSOGenericInterface SSO Interface URL: https://signinservice.apac.microsoftonline.com/ssoservice/UID
    1/26/2012 1:57:52 PM Exception SSOSignIn.SignIn The security timestamp is invalid because its creation time ('2012-01-26T08:34:52.767Z') is in the future. Current time is '2012-01-26T08:27:52.987Z' and allowed clock skew is '00:05:00'.
    1/26/2012 1:57:52 PM Exception SSOSignIn.SignIn
    Although the Windows 7 clients successfully synchronized time to the domain controller for the domain, the domain controller was not configured to synchronize time with external NTP servers. This caused a gradual drift in time on the network, resulting in the above issue. Reconfigured the domain controller holding the PDC FSMO role to synchronize time with an external time source (time.nist.gov) and edited the system policy on the ISA Server firewall to allow NTP traffic to time.nist.gov. See: Configure the time source for the forest: Windows Time Service - http://technet.microsoft.com/en-us/library/cc794937(WS.10).aspx
    Forced synchronization of Windows time using the command w32tm /resync on the domain controller and later on the clients, each of which had the seven-minute difference corrected. This resolved the problem with logon to Microsoft Online Services Sign In.
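    For reference, the Windows Time Service commands for this kind of fix typically look like the following (a sketch, run in an elevated command prompt; time.nist.gov as above):

        rem On the domain controller holding the PDC FSMO role:
        w32tm /config /manualpeerlist:time.nist.gov /syncfromflags:manual /reliable:yes /update
        net stop w32time && net start w32time
        w32tm /resync
        rem Then on each client:
        w32tm /resync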

    Read the article

  • Wammu, Samsung J700 error GetNextMemory code: 56

    - by Tamas
    I have got an (old) Samsung J700i. When connecting it with a USB cable to Wammu, access was denied at first. Now it is OK... However, when I try to get info out of the phone I get this error message: "Error while communicating with phone. Description: Internal phone error. Function: GetNextMemory. Error code: 56". I am using Ubuntu 12.04 and Wammu 0.36, running on Python 2.7.3, using wxPython 2.8.12.1, python-gammu 1.31.0 and Gammu 1.31.0. How may I access data on the phone? Thanks, Tamas
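    Since Wammu sits on top of python-gammu, checking whether plain Gammu can read the phone at all helps narrow things down. A rough sketch; the port and connection values below are only guesses for this phone:

        # ~/.gammurc
        [gammu]
        port = /dev/ttyACM0
        connection = at
        # then, from a terminal:
        gammu identify
        gammu getallmemory ME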

    Read the article

  • Internet stopped working after aircrack install (can still connect to my router). Suggestions?

    - by Dan
    I'm sure I did something dumb during the install for aircrack, but I have no clue what. Like I said, I can connect to my router (not trying to crack, simply logging into it), I just can't get internet access. To make this more interesting, the machine is dual booted and when I log into Windows I have zero issues. I feel like some driver may have been messed up in the process, but I'm not sure if that's a reasonable assumption or how exactly to check that. Also, can't revert to backup as there are none. Suggestions on how to trouble-shoot would be appreciated. EDIT: Hard-wiring to the wall I can get internet access on the Ubuntu side... PS- I know this is a stretch as far as fitting the Ubuntu section, but no other stackoverflow sites seemed to fit better.
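    A likely culprit is that airmon-ng left the card in monitor mode and/or killed the network manager; something along these lines (interface names are examples) usually brings things back:

        # is there still a mon0 / monitor-mode interface?
        sudo airmon-ng
        iwconfig
        # take the monitor interface back down (name is an example)
        sudo airmon-ng stop mon0
        # restart the normal network stack (12.04-era service name)
        sudo service network-manager restart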

    Read the article

  • Wifi won't work without ethernet (Ubuntu 12.04)

    - by alok
    I have a strange problem. I am running Ubuntu 12.04 on a Dell Studio 1558 laptop. I'm usually unable to access Wifi (I have a BCM43224 wireless card with the STA proprietary driver installed.) So I have to unplug the ethernet cable from my Wifi router and stick it into my laptop's ethernet port to access the Internet. But when I stick the cable back into the Wifi router and try connecting to Wifi, Wifi inexplicably starts working. How can I get Wifi to work all the time without having to 'jumpstart' it with an ethernet connection?
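    One thing worth trying with the Broadcom STA (wl) driver is reloading it and keeping the competing open drivers out of the way; the module names below are the usual suspects rather than anything verified for this exact laptop:

        # reload the proprietary driver
        sudo modprobe -r wl
        sudo modprobe wl
        # keep the open drivers from grabbing the card first
        echo "blacklist b43"  | sudo tee -a /etc/modprobe.d/blacklist-bcm.conf
        echo "blacklist ssb"  | sudo tee -a /etc/modprobe.d/blacklist-bcm.conf
        echo "blacklist bcma" | sudo tee -a /etc/modprobe.d/blacklist-bcm.conf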

    Read the article

  • Better way to set up samba bridge?

    - by Adam Butler
    I have an old Ubuntu laptop hooked up between my wireless network and a wired media player box. I had previously shared my wireless network connection so the media player had internet access (i.e. via NAT), but because it was on a different subnet it could not access the file shares on the wireless network. To get around this I mounted the drives from the wireless network on the laptop and re-shared them with Samba. This worked OK but had some drawbacks: it seemed slow, and if the network computers were turned off when the laptop rebooted I had to mount the shares manually. I've just re-installed with the latest Ubuntu and was wondering if there is a better way to do this. Is there some way to bridge so the media player appears to be on the wireless network? Would this give better performance? Any other options? I'm also thinking there might be some Samba options that could buffer files?
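    For the "shares have to be remounted by hand after a reboot" part, a CIFS entry in /etc/fstab is one rough option (host, share and credentials file below are placeholders):

        # /etc/fstab
        //wifi-pc/media  /mnt/media  cifs  credentials=/root/.smbcred,_netdev,iocharset=utf8  0  0

    With _netdev the mount waits for networking to come up, and a later "sudo mount -a" re-attaches the share if the remote machine was still off at boot.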

    Read the article

  • The inevitable Hello World post!

    - by brendonpage
    Greetings to anyone reading this! This is my first of hopefully many posts. I would like to use this post to introduce myself and to let you know what to expect from this blog in future.
    Okay, so a bit about myself. In case you missed the name of this blog, my name is Brendon Page! I am a Software Developer from South Africa and work for a small company whose main focus is producing software for the kitchen cupboard industry, although from time to time we do produce custom solutions for other industries. I work in a small team of 3, including myself, and am fortunate enough to work from home! I have been involved in IT since 1996, which is when I got my first PC, and started working as a junior programmer in 2003. Outside of work I enjoy playing squash, PC games and of course LANing with my friends. If I get any free time between all of that I will usually dedicate some of it to a personal project; these are mainly prototypes for an idea I have had or for something that could be useful at work. I was in two minds on whether to include a photo of myself. The reason for this was that while I was looking for a suitable photo to use, it dawned on me how much time I dedicate to pulling funny faces in photos! I also realized how little I shave, which I blame completely on working from home. So after much debate here I am, funny face, beard and all!
    Now that you know a bit about me, let's move on to what to expect from this blog. I work predominantly with Microsoft technologies, so most if not all of my posts will be related to something Microsoft. Since most of my job entails software development you can expect a lot of posts dealing with the .NET Framework. I am currently working on a large Silverlight project, so my first few posts will be targeted in that direction. I will be striving to make the content of my posts as useful as possible from both an explanation and a code perspective; I aim to include a working solution for every post, which I will put up on my SkyDrive for download. Here is what I have planned for my next few posts:
    Where did my session variables go? - Here I will take you through the lessons I learnt the hard way about the ASP.NET session. I am not going to go into too much depth in this post, as there is already a lot of information available on it. I mainly want to cover it in an effort to keep the scope creep of my posts to a minimum; some of the solutions I upload will use it, and I would like to have a post I can reference to explain why I am doing something a certain way.
    Uploading files through Silverlight - Again there is a lot of existing information on this topic, so I won't be going into too much depth, but I will be using the solution from this as a base for my next post.
    Generating and displaying DeepZoom images dynamically in Silverlight - Well, the title pretty much speaks for itself on this one. As I mentioned, I will be building off the solution that I create in my 'Uploading files through Silverlight' post.
    Securing DeepZoom images using a custom implementation of the MultiScaleTileSource - In this post I will look at the privacy issue surrounding the default usage of DeepZoom images in Silverlight and how to overcome it. This makes the use of DeepZoom in privacy-conscious applications more viable.
    Thanks to anyone who actually read this post! I look forward to producing more which will hopefully be helpful to you.

    Read the article

  • Accessing SQL Server data from iOS apps

    - by RobertChipperfield
    Almost all mobile apps need access to external data to be valuable. With a huge amount of existing business data residing in Microsoft SQL Server databases, and an ever-increasing drive to make more and more available to mobile users, how do you marry the rather separate worlds of Microsoft's SQL Server and Apple's iOS devices?
    The classic answer: write a web service layer. Look at any of the questions on this topic asked in Internet discussion forums, and you'll inevitably see the answer, "just write a web service and use that!". But what does this process gain? For a well-designed database with a solid security model, and business logic in the database, writing a custom web service on top of this just to access some of the data from a different platform seems inefficient and unnecessary. Desktop applications interact with the SQL Server directly - why should mobile apps be any different?
    The better answer: the iSql SDK. Working along the lines of "if you do something more than once, make it shared," we set about coming up with a better solution for the general case. And so the iSql SDK was born: sitting between SQL Server and your iOS apps, it provides the simple API you're used to if you've been developing desktop apps using the Microsoft SQL Native Client. It turns out a web service remained a sensible idea: HTTP is much more suited to the Big Bad Internet than SQL Server's native TDS protocol, removing the need for complex configuration, firewall configuration, and the like. However, rather than writing a web service for every app that needs data access, we made the web service generic, serving only as a proxy between the SQL Server and a client library integrated into the iPhone or iPad app. This client library handles all the network communication, and provides a clean API.
    OSQL in 25 lines of code. As an example of how to use the API, I put together a very simple app that allowed the user to enter one or more SQL statements, and displayed the results in a rather primitively formatted text field. The total amount of Objective-C code responsible for doing the work? About 25 lines. You can see this in action in the demo video.
    Beta out now - your chance to give us your suggestions! We've released the iSql SDK as a beta on the MobileFoo website: you're welcome to download a copy, have a play in your own apps, and let us know what we've missed using the Feedback button on the site. Software development should be fun and rewarding: no-one wants to spend their time writing boiler-plate code over and over again, so stop writing the same web service code, and start doing exciting things in the new world of mobile data!

    Read the article

  • Security Issues When Creating Pages in SharePoint

    - by Damon
    I was speaking (or rather IM'ing) with Ben Collins a while back and he came across an interesting problem that I wanted to document for the sake of posterity. If you have a SharePoint user who has permissions to create a page in a page library, but that user is having security issues trying to actually make a page, then the security issue may be related to their access rights on the master page gallery. Users who create pages must have at least restricted read access to the master page gallery for page creation to succeed. That is one of the joys of working in SharePoint: if something doesn't show up there is usually a good but obscure reason for it, but SharePoint certainly won't tell you outright what it is. All I have to say is that I'm glad he ran into that issue and not me.

    Read the article

  • How to Sync Files Between Computers Without Storing Them in the Cloud

    - by Chris Hoffman
    So you have multiple computers and you want to keep your files in sync, but you don’t want to store them on someone else’s servers. You’ll want a service that synchronizes files directly between your computers. With such a service, you can synchronize an unlimited amount of files and people can’t gain access to your files just by gaining access to an account on a server and viewing the files via the web interface. We’re focused on syncing files over the network here — either over a local network or the Internet. We’re looking for Dropbox-style solutions that don’t store files on a central server like Dropbox does.    

    Read the article

  • How do I disable changing proxy settings?

    - by gap
    I've got several machines, running 14.04.1, for kids to access. Each machine has accounts for each kid, as well as accounts for several adults. Although I've set a system-wide proxy use policy, it doesn't get used by Chrome. Instead, I had to log into each kid's account and set a proxy use policy, pointing to tinyproxy/dansguardian, for safe internet access. The problem is that, should the kids get an ounce of computer savvy, they'll figure out that they can launch the proxy settings config panel from Unity, then change the proxy settings to None, and completely bypass the "safe" internet scheme I've set up. Can anyone tell me how I can prevent those user accounts from being able to modify their proxy settings (not the "system-wide" ones... those aren't used... these are the per-user settings)? Thanks
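    One sketch of a fix, assuming the per-user proxy keys live under org.gnome.system.proxy as on stock 14.04, is to pin them with a system dconf database plus a lock file (paths and key names are from memory, so double-check them):

        # /etc/dconf/profile/user
        user-db:user
        system-db:local

        # /etc/dconf/db/local.d/00-proxy
        [system/proxy]
        mode='manual'
        [system/proxy/http]
        host='127.0.0.1'
        port=8080

        # /etc/dconf/db/local.d/locks/00-proxy
        /system/proxy/mode
        /system/proxy/http/host
        /system/proxy/http/port

        # rebuild the binary database
        sudo dconf update

    Locked keys can no longer be changed per user, so edits made in the settings panel should no longer stick.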

    Read the article

  • Web Usage Policies at "Best Companies to Work For"

    - by Greg
    For any company that has made it onto a "Best Companies to Work For" list* in 2010 that hires programmers, what restrictions on web usage do they have in place? Does that company view unrestricted internet access as a benefit to their employees even if they might use it for non-work related reasons? If you can provide a link to back up this information, even better. (* I don't want to promote any specific publication, but they are easy enough to find.) My hypothesis is that a company that aspires to be one of the best needs to provide fairly unrestricted internet access and I'm trying to test it.

    Read the article

  • How can I determine which GPU card is running at PCI Express 2.0 x16 & which is using x8?

    - by M. Tibbits
    Is there a way to determine the speed of the PCI Express connection to a specific card? I have three cards plugged in: two Nvidia GTX 480's (one at x16 & and one at x8) one Nvidia GTX 460 running at x8 Is there some way, either by a function call in C or an option to lspci that I can determine the bus speed of the graphics cards? When I only use one of the cards for my CUDA program, I'd like to use the one which is running at x16. Thanks! Note: lspci -vvv dumps out For the two GTX 480s. I don't see any differences that pertain to bus speed. 03:00.0 VGA compatible controller: nVidia Corporation Device 06c0 (rev a3) Subsystem: eVga.com. Corp. Device 1480 Control: I/O- Mem+ BusMaster+ SpecCycle- MemWINV- VGASnoop- ParErr- Stepping- SERR- FastB2B- DisINTx- Status: Cap+ 66MHz- UDF- FastB2B- ParErr- DEVSEL=fast >TAbort- <TAbort- <MAbort- >SERR- <PERR- INTx- Latency: 0 Interrupt: pin A routed to IRQ 16 Region 0: Memory at d4000000 (32-bit, non-prefetchable) [size=32M] Region 1: Memory at b0000000 (64-bit, prefetchable) [size=128M] Region 3: Memory at bc000000 (64-bit, prefetchable) [size=64M] Region 5: I/O ports at df00 [disabled] [size=128] [virtual] Expansion ROM at b8000000 [disabled] [size=512K] Capabilities: <access denied> Kernel driver in use: nvidia Kernel modules: nvidia, nvidiafb, nouveau 03:00.1 Audio device: nVidia Corporation Device 0be5 (rev a1) Subsystem: eVga.com. Corp. Device 1480 Control: I/O- Mem- BusMaster- SpecCycle- MemWINV- VGASnoop- ParErr- Stepping- SERR- FastB2B- DisINTx- Status: Cap+ 66MHz- UDF- FastB2B- ParErr- DEVSEL=fast >TAbort- <TAbort- <MAbort- >SERR- <PERR- INTx- Interrupt: pin B routed to IRQ 5 Region 0: [virtual] Memory at d7ffc000 (32-bit, non-prefetchable) [disabled] [size=16K] Capabilities: <access denied> 04:00.0 VGA compatible controller: nVidia Corporation Device 06c0 (rev a3) Subsystem: eVga.com. Corp. Device 1480 Control: I/O+ Mem+ BusMaster+ SpecCycle- MemWINV- VGASnoop- ParErr- Stepping- SERR- FastB2B- DisINTx- Status: Cap+ 66MHz- UDF- FastB2B- ParErr- DEVSEL=fast >TAbort- <TAbort- <MAbort- >SERR- <PERR- INTx- Latency: 0 Interrupt: pin A routed to IRQ 16 Region 0: Memory at dc000000 (32-bit, non-prefetchable) [size=32M] Region 1: Memory at c0000000 (64-bit, prefetchable) [size=128M] Region 3: Memory at cc000000 (64-bit, prefetchable) [size=64M] Region 5: I/O ports at cf00 [size=128] [virtual] Expansion ROM at c8000000 [disabled] [size=512K] Capabilities: <access denied> Kernel driver in use: nvidia Kernel modules: nvidia, nvidiafb, nouveau 04:00.1 Audio device: nVidia Corporation Device 0be5 (rev a1) Subsystem: eVga.com. Corp. 
Device 1480 Control: I/O- Mem+ BusMaster+ SpecCycle- MemWINV- VGASnoop- ParErr- Stepping- SERR- FastB2B- DisINTx- Status: Cap+ 66MHz- UDF- FastB2B- ParErr- DEVSEL=fast >TAbort- <TAbort- <MAbort- >SERR- <PERR- INTx- Latency: 0, Cache Line Size: 64 bytes Interrupt: pin B routed to IRQ 5 Region 0: Memory at dfffc000 (32-bit, non-prefetchable) [size=16K] Capabilities: <access denied> And the only differences I see relate specifically to the memory mapping: myComputer:~> diff card1 card2 3c3 < Control: I/O- Mem+ BusMaster+ SpecCycle- MemWINV- VGASnoop- ParErr- Stepping- SERR- FastB2B- DisINTx- --- > Control: I/O+ Mem+ BusMaster+ SpecCycle- MemWINV- VGASnoop- ParErr- Stepping- SERR- FastB2B- DisINTx- 7,11c7,11 < Region 0: Memory at d4000000 (32-bit, non-prefetchable) [size=32M] < Region 1: Memory at b0000000 (64-bit, prefetchable) [size=128M] < Region 3: Memory at bc000000 (64-bit, prefetchable) [size=64M] < Region 5: I/O ports at df00 [disabled] [size=128] < [virtual] Expansion ROM at b8000000 [disabled] [size=512K] --- > Region 0: Memory at dc000000 (32-bit, non-prefetchable) [size=32M] > Region 1: Memory at c0000000 (64-bit, prefetchable) [size=128M] > Region 3: Memory at cc000000 (64-bit, prefetchable) [size=64M] > Region 5: I/O ports at cf00 [size=128] > [virtual] Expansion ROM at c8000000 [disabled] [size=512K] 18c18 < Control: I/O- Mem- BusMaster- SpecCycle- MemWINV- VGASnoop- ParErr- Stepping- SERR- FastB2B- DisINTx- --- > Control: I/O- Mem+ BusMaster+ SpecCycle- MemWINV- VGASnoop- ParErr- Stepping- SERR- FastB2B- DisINTx- 19a20 > Latency: 0, Cache Line Size: 64 bytes 21c22 < Region 0: [virtual] Memory at d7ffc000 (32-bit, non-prefetchable) [disabled] [size=16K] --- > Region 0: Memory at dfffc000 (32-bit, non-prefetchable) [size=16K]
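    The "Capabilities: <access denied>" lines are the problem: run lspci as root and the PCI Express capability, including the negotiated link width, becomes visible. Something like the following (bus addresses taken from the dump above):

        sudo lspci -vv -s 03:00.0 | grep -E "LnkCap|LnkSta"
        sudo lspci -vv -s 04:00.0 | grep -E "LnkCap|LnkSta"
        # LnkSta reports the link actually in use, e.g. "Speed 5GT/s, Width x16"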

    Read the article

  • How to drag from a background window to the front window

    - by Luis Alvarado
    Is what I am about to explain possible with a key combination? Here is the image: As you can see, the terminal is the focused (front) window and Nautilus is in the background (back window). How can I grab a folder or file from Nautilus without losing focus on the terminal (without making the terminal go to the background and Nautilus come to the front) and drop it in the terminal? What I want is not to have to ALT+TAB again just to do this. Options like resizing the windows to fit the screen are not what I am looking for. Like in the image, we have a fullscreen window that we want to stay like that. We can drag the terminal window around, but any time I access the background Nautilus window, I should not lose focus on the terminal (it should not go to the background every time I access Nautilus). Maybe something like a key combination that freezes the current window positions so I can drag from background windows to background windows, or from background windows to the focused front one.

    Read the article

  • How to make custom libraries accessible?

    - by Milen Bilyanov
    I am trying to compile and install every custom module under its own designated folder (ex: /myApps/myLinux/compiled_app). I had luck with Python so far, where my Python is compiled from source and lives in /myApps/myLinux/python2.5, with "python2.5" -> /myApps/myLinux/python2.5.6-gcc463, so I can access this Python through a wrapper script that sets the right environment. Recently I had to compile and add something called gperf 3.0.4. So now it lives in /myApps/myLinux/gperf3.0, with "gperf3.0" -> /myApps/myLinux/gperf3.0.4-gcc463. The question is: how will I point to this library if some other app needs to access it? Is it done through the LD_LIBRARY_PATH variable? Thanks.
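    Yes, LD_LIBRARY_PATH (plus PATH for the binaries) is the usual route. A wrapper in the same spirit as the python2.5 one might look like this, following the layout above (the bin/lib subdirectories are assumptions):

        #!/bin/bash
        # hypothetical gperf3.0 wrapper script
        GPERF_ROOT=/myApps/myLinux/gperf3.0
        export PATH="$GPERF_ROOT/bin:$PATH"
        export LD_LIBRARY_PATH="$GPERF_ROOT/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
        exec "$@"

    Alternatively, apps that are themselves compiled against this gperf can bake the search path in at link time with -Wl,-rpath, so no environment variable is needed at run time.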

    Read the article

  • Is this Anti-Scraping technique viable with Crawl-Delay?

    - by skibulk
    I want to prevent web scrapers from abusing the 1,000,000 pages on my website. I'd like to do this by returning a "503 Service Unavailable" HTTP error code to users that access an abnormal number of pages per minute. I don't want search engine spiders to ever receive the error. My inclination is to set a robots.txt crawl-delay that will ensure spiders access a number of pages per minute that stays under my 503 threshold. Is this an appropriate solution? Do all major search engines support the directive? Could it negatively affect SEO? Are there any other solutions or recommendations?
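    A minimal robots.txt along these lines asks crawlers to pause between requests; note that support varies by engine (Bing and Yahoo honour Crawl-delay, while Googlebot ignores it and takes its crawl rate from Webmaster Tools instead):

        User-agent: *
        Crawl-delay: 10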

    Read the article

  • Size doesn't matter

    - by ssoolsma
    Whenever I start a new project I *always* break up my code into different projects, also known as an n-tier solution. The scale of the project doesn't matter, but make sure that each project is responsible for itself. I make sure that I:
    1. At least thought about how the project should work, on the toilet or in a project team meeting.
    2. Have a solution directory and create my projects within it. I like to name my projects (and their folders) by the namespaces. For instance, when I'm creating a piece of (web) software called ChuckNorris, I always include the software name in my projects.
    3. Start off with designing the DataAccess project. I name it ChuckNorris.DataAccess, which lets me easily identify the project in case the project scales a lot.
    4. Build the classes which represent the database structure. Don't stop working on a class until it's finished for now. Also, don't overdo the methods. Build stuff only when it's needed, and don't think "Hm, that would be cool to have", because most of the time you end up with unused code, and we don't want that.
    5. Build a unit test project and make sure you create its folder inside the project that it's testing. So, create the ChuckNorris.DataAccess.UnitTest project inside the folder of the data-access project. I would suggest using the nUnit test framework.
    6. In case you thought "hm, I'll skip the unit tests": don't! Just build them - it will save you a lot of time later on.
    7. Now, read 5 again. Build that bloody unit test. Don't skip. (I can't emphasize this enough.)
    8. Now, every class in the data-access project is responsible for itself. They don't rely on each other. This is what we use the BusinessLogic project for. Start creating the ChuckNorris.BusinessLogic project (not inside the data-access project of course, but within the ChuckNorris folder).
    9. Combine stuff from data-access. This usually involves a lot of copying the data-access classes and feels silly at first. (We'll get to that later on.)
    10. Now you come to the point of creating a service project. You might not always see why to use it, but see it as a way to expose your business logic to any application (including your own). Sometimes I use it as a so-called "Factory". Every call goes through this factory, so that's the only thing I'm exposing to any program, and I make sure that those methods are the only ones I allow you to invoke.
    11. Build any UI (website, phone app, forms application, Silverlight, WPF or whatever) and reference your service project from it.
    12. Fall in love (cough) with this approach.
    It's possible that this doesn't seem to make much sense and looks very incomplete. Well, that last part is correct. The next post will go into detail on setting up your data-access project using the Entity Framework.
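    Put together, the folder layout those steps describe comes out roughly like this (a sketch using the post's example name; the Services and UI project names are made up for illustration):

        ChuckNorris\                      <- solution directory
            ChuckNorris.sln
            ChuckNorris.DataAccess\
                ChuckNorris.DataAccess.UnitTest\
            ChuckNorris.BusinessLogic\
            ChuckNorris.Services\
            ChuckNorris.Web\              <- or any other UI project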

    Read the article

  • Mount to /dev/sdb1 without password

    - by Jarmo
    I am unable to mount a USB drive (or SD card) on my system without root access. When I plug in a USB drive, it is visible in the left column of Nautilus, but when I click on it to open it, I receive the error message "Unable to mount 2.1 GB Filesystem - Error mounting: mount exited with exit code 1: helper failed with: mount: only root can mount /dev/sdb1 on /media/sdb1". I am able to mount the drive using sudo mount -w /dev/sdb1, but this causes problems for operations such as creating startup discs, which require unmounting and remounting the drive. I suspect this problem may be caused by the fact that when I upgraded from 11.10 to 12.04, I had an SD card plugged in. This caused the system to stall during later startups, as it could not find this drive. I remedied this by editing a line of /etc/fstab to read /dev/sdb1 /media/sdb1 vfat noauto 0 0 However, I am dual booting Ubuntu with Windows XP, and I have no problem mounting the C: drive of the Windows system without root access, so I feel that this is a problem related to the mount point rather than mounting in general.
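    The fstab line quoted above is exactly what forces root-only mounts; adding the user option (or deleting the line entirely so udisks handles the device again) normally lets a regular account mount and unmount it. A sketch:

        # /etc/fstab -- "user" allows non-root mount/umount of this entry
        /dev/sdb1  /media/sdb1  vfat  user,noauto  0  0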

    Read the article

  • My wireless/WiFi connection does not work. What can I do?

    - by Wild Man
    Your situation:
    - You have successfully installed Ubuntu.
    - You have just downloaded and booted Ubuntu live media (the latest LTS (see also HWE) or latest non-LTS release is preferred; see the list of Ubuntu releases that are currently supported).
    - You upgraded your Ubuntu installation to the latest release that the software updater offered you; WiFi worked before, but not on the new release.
    - You migrated your existing Ubuntu installation to new hardware.
    Your problem:
    - The wireless of your laptop or desktop is not working.
    - You tried switching the wireless switch off and on and you tried rebooting several times, but you don't see any WiFi access points.
    - You can see your wireless access point, but you cannot establish a connection.
    - You want to analyze the problem, but you don't know where to start or what information you can provide.
    Related questions: I have a hardware detection problem, what logs do I need to look into?
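    As a starting point, the output of a few commands usually pins down the card, driver and firmware situation (a generic checklist, not specific to any one chipset):

        rfkill list                       # soft/hard blocks from the wireless switch
        sudo lspci -nnk | grep -iA3 net   # PCI card and the driver bound to it
        lsusb                             # USB wireless adapters
        iwconfig                          # does a wireless interface exist at all?
        dmesg | grep -iE "firmware|wlan"  # missing-firmware or driver errors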

    Read the article

  • Wireless Connection Troubles

    - by James
    I just recently switched from Windows 7 over to Ubuntu 12.04 and have been experiencing some issues connecting to my home's wireless network. The only way I can get it to connect to the network is by disabling IPv4 and IPv6 settings. Even then while it says its connected to the network (3 bars), I'm unable to access the Internet. It connected for a little while after I first installed Ubuntu, but after the first reboot I haven't been able to access the web at all. I have very basic knowledge when it comes to computers and barely any when dealing with Ubuntu and Linux. I'm very happy with Ubuntu apart from this one issue, as before my computer was overheating and crashing, I've yet to experience any of those problems since installing Ubuntu. The information I can give may be very limited since I'm having to use my cell phone to figure out the solution to this. Any help would be greatly appreciated. Thanks in advance!

    Read the article
