Search Results

Search found 7212 results on 289 pages for 'cumulative updates'.


  • Acer Aspire 5542G overheating with ubuntu/kubuntu 12.04

    - by james
    I have an Acer Aspire 5542G laptop purchased a couple of years ago. Until now I used Windows 7 on it; then I tried Ubuntu 12.04. Everything was fine except the overheating issue. I updated Ubuntu with all security fixes and available updates, but nothing solved the problem. With idle use like internet browsing, the CPU fan speeds up a lot and I can feel very hot air coming from the vent (comparable to playing a serious 3D game in Windows). It never gets to the point of freezing or shutting down, but as long as I'm using it, with no intensive tasks at all, the laptop stays too hot. This wasn't the case with Windows 7, where the fan won't spin at all under normal use. I heard there was a manufacturing defect in some Acer laptops, but I don't think that applies to my laptop since Windows 7 runs perfectly. I updated the BIOS to the latest version, cleaned the dust out of the vents, and tried an up-to-date Kubuntu 12.04. Nothing solved the issue. My laptop specs are: CPU: AMD Turion II X2 M500 @ 2.2GHz, GPU: AMD Mobility Radeon HD 4570, 3GB RAM and a 320GB hard disk.

    Read the article

  • Internet unusably slow with Realtek Semiconductor Co., Ltd. RTL8111/8168B card

    - by user42424
    So I have recently installed Ubuntu 11.10 for a dual boot with Windows 7. After the install I had around 300 updates, so I installed them. At first I could use the internet, although it was extremely slow. Now, however, I cannot: sometimes a page will load and other times it will simply time out. When I try to download something it either takes forever or doesn't complete at all. This is a wired system. On the Windows side my speeds are fine. Any help would be greatly appreciated. Also, like I said, I am new to Linux/Ubuntu, so please be nice. One last thing: I also installed 11.10 in the same dual-boot setup on my laptop, and its wireless speed is the same as on Windows, so only the wired desktop gives me the problem. Here is some hardware info; hope it helps. Mobo: Gigabyte GA-880GMA- AMD / CPU: AMD Phenom(tm) II X4 965 / 16 GB RAM / Realtek PCIe GBE Family Controller / Cisco Linksys E2000 / Ethernet controller: Realtek Semiconductor Co., Ltd. RTL8111/8168B PCI Express Gigabit Ethernet controller (rev 06) / eth0 Link encap:Ethernet HWaddr 50:e5:49:33:64:cf inet addr:192.168.1.118 Bcast:192.168.1.255 Mask:255.255.255.0 inet6 addr: fe80::52e5:49ff:fe33:64cf/64 Scope:Link UP BROADCAST RUNNING MULTICAST MTU:1500 Metric:1 RX packets:76722 errors:0 dropped:76722 overruns:0 frame:76722 TX packets:49692 errors:0 dropped:65 overruns:0 carrier:0 collisions:0 txqueuelen:1000 RX bytes:107956638 (107.9 MB) TX bytes:4342553 (4.3 MB) Interrupt:44 Base address:0x2000 Thanks to roadmr, problem solved! I powered down the PC, unplugged the power cable from the PC end, waited a few (maybe 3) minutes, plugged the power back in, and pushed and held the power button for 30+ seconds. Then I let go, powered on the PC, and my internet is fine! Downloads and web speed blaze, just like on my Win 7 boot, maybe even faster. Problem solved, thanks to all!

    Read the article

  • Which approach is the most maintainable?

    - by 2rs2ts
    When creating a product which will inherently suffer from regression due to OS updates, which of these is the preferable approach when trying to reduce maintenance cost and the likelihood of needing refactoring, when considering the task of interpreting system state and settings for a lay user? Delegate the responsibility of interpreting the results of inspecting the system to the modules which perform these tasks, or, Separate the concerns of interpretation and inspection into two modules? The first obviously creates a blob in which a lot of code would be verbose, redundant, and hard to grok; the second creates a strong coupling in which the interpretation module essentially has to know what it expects from inspection routines and will have to adapt to changes to the OS just as much as the inspection will. I would normally choose the second option for the separation of concerns, foreseeing the possibility that inspection routines could be re-used, but a developer updating the product to deal with a new OS feature or something would have to not only write an inspection routine but also write an interpretation routine and link the two correctly - and it gets worse for a developer who has to change which inspection routines are used to get a certain system setting, or worse yet, has to fix an inspection routine which broke after an OS patch. I wonder, is it better to have to patch one package a lot or two packages, each somewhat less so?
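
    A minimal sketch of the second option, with Python standing in as a neutral language and with a hypothetical "firewall" setting and invented function names: the inspection routine reports raw system state only, and the interpretation routine turns that state into wording for a lay user, so an OS change ideally touches only one of the two.

        # Sketch only: "firewall" is a stand-in for any system setting the product inspects.

        def inspect_firewall():
            """Inspection: return raw system state, no user-facing wording."""
            # In the real product this would query the OS (registry, sysctl, an API, ...).
            return {"enabled": True, "profile": "public"}

        def interpret_firewall(raw):
            """Interpretation: map raw state to a message a lay user can act on."""
            if raw["enabled"]:
                return "Your firewall is on (profile: %s)." % raw["profile"]
            return "Your firewall is off, so your computer may be exposed."

        if __name__ == "__main__":
            print(interpret_firewall(inspect_firewall()))

    With this split, an OS patch that changes how the setting is read means editing inspect_firewall only, while a wording change means editing interpret_firewall only; the cost is exactly the coupling the question describes, since the interpreter must know the shape of the data the inspector returns.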

    Read the article

  • Simple Interactive Search with jQuery and ASP.Net MVC

    - by Doug Lampe
    Google now has a feature where the search updates as you type in the search box. You can implement the same feature in your MVC site with a little jQuery and MVC Ajax. Here's how:
    1. Create a JavaScript global variable to hold the previous value of your search box.
    2. Use setTimeout to check whether the search box has changed after some interval. (Note: don't use setInterval, since we don't want to have to turn the timer off while Ajax is processing.)
    3. Submit the form if the value has changed.
    4. Set the update target to display your results.
    5. Set the on-success callback to "start" the timer again. This, along with step 2 above, will make sure that you don't send multiple requests until the initial request has finished processing.
    Here is the code:

    <script type="text/javascript">
        var searchValue = $('#Search').val();
        $(function () {
            setTimeout(checkSearchChanged, 0.1);
        });
        function checkSearchChanged() {
            var currentValue = $('#Search').val();
            if ((currentValue) && currentValue != searchValue && currentValue != '') {
                searchValue = $('#Search').val();
                $('#submit').click();
            }
            else {
                setTimeout(checkSearchChanged, 0.1);
            }
        }
    </script>

    <h2>Search</h2>
    <% using (Ajax.BeginForm("SearchResults", new AjaxOptions { UpdateTargetId = "searchResults", OnSuccess = "checkSearchChanged" })) { %>
        Search: <%= Html.TextBox("Search", null, new { @class = "wide" })%><input id="submit" type="submit" value="Search" />
    <% } %>
    <div id="searchResults"></div>

    That's it!

    Read the article

  • How can I get my monitor's maximum resolution without the proprietary AMD graphic driver installed?

    - by Venki
    I am using Ubuntu 14.04 and have an AMD Radeon HD 5570 graphics card. The default open-source REDWOOD driver isn't letting me choose my monitor's maximum screen resolution (which is 1366x768); I only get two resolutions, 1024x768 and 800x600. If I run the command xrandr -s 1366x768, the output is: Size 1366x768 not found in available modes. So, just for the sake of getting 1366x768, I am forced to install the proprietary driver that AMD provides on its site. But if I install it (which is itself quite a problem-prone process), I run into a lot of 'inconvenience': sometimes after an OS update the driver crashes Unity, and then I have to uninstall it from a TTY and google around for a solution. I also occasionally encounter screen tearing, and in addition I can't see my login screen (see this question, which describes that particular problem). The main problem is that AMD does not update its driver as quickly as Ubuntu updates its OS, which is quite irritating. So: I want the maximum resolution (and performance) that my graphics card and monitor can give me without installing AMD's 'problematic' proprietary driver. Is this possible? Suggestions please. Thanks in advance. PS: more system specs: Intel i3 2100 processor, AMD P8H61-M PLUS2 motherboard, AMD Radeon HD 5570 graphics card, DELL monitor. (BTW, thank you for reading through my elaborate description!)

    Read the article

  • Develop in trunk and then branch off, or in release branch and then merge back?

    - by Torben Gundtofte-Bruun
    Say that we've decided on following a "release-based" branching strategy, so we'll have a branch for each release, and we can add maintenance updates as sub-branches from those. Does it matter whether we: develop and stabilize a new release in the trunk and then "save" that state in a new release branch; or first create that release branch and only merge into the trunk when the branch is stable? I find the former to be easier to deal with (less merging necessary), especially when we don't develop on multiple upcoming releases at the same time. Under normal circumstances we would all be working on the trunk, and only work on released branches if there are bugs to fix. What is the trunk actually used for in the latter approach? It seems to be almost obsolete, because I could create a future release branch based on the most recent released branch rather than from the trunk. Details based on comment below: Our product consists of a base platform and a number of modules on top; each is developed and even distributed separately from each other. Most team members work on several of these areas, so there's partial overlap between people. We generally work only on 1 future release and not at all on existing releases. One or two might work on a bugfix for an existing release for short periods of time. Our work isn't compiled and it's a mix of Unix shell scripts, XML configuration files, SQL packages, and more -- so there's no way to have push-button builds that can be tested. That's done manually, which is a bit laborious. A release cycle is typically half a year or more for the base platform; often 1 month for the modules.

    Read the article

  • Multiple vulnerabilities in Firefox

    - by Ritwik Ghoshal
    Component: Firefox
    Product and Resolution: Solaris 10 SPARC: 145080-13, X86: 145081-12
    CVE and Description (CVSSv2 Base Score in parentheses):
    CVE-2012-3982: Denial of service (DoS) vulnerability (10.0)
    CVE-2012-3983: Denial of service (DoS) vulnerability (10.0)
    CVE-2012-3986: Permissions, Privileges, and Access Controls vulnerability (6.4)
    CVE-2012-3988: Resource Management Errors vulnerability (9.3)
    CVE-2012-3990: Resource Management Errors vulnerability (10.0)
    CVE-2012-3991: Permissions, Privileges, and Access Controls vulnerability (9.3)
    CVE-2012-3992: Permissions, Privileges, and Access Controls vulnerability (5.8)
    CVE-2012-3993: Design Error vulnerability (9.3)
    CVE-2012-3994: Improper Neutralization of Input During Web Page Generation ('Cross-site Scripting') vulnerability (4.3)
    CVE-2012-3995: Improper Restriction of Operations within the Bounds of a Memory Buffer vulnerability (10.0)
    CVE-2012-4179: Resource Management Errors vulnerability (10.0)
    CVE-2012-4180: Improper Restriction of Operations within the Bounds of a Memory Buffer vulnerability (10.0)
    CVE-2012-4181: Resource Management Errors vulnerability (10.0)
    CVE-2012-4182: Resource Management Errors vulnerability (10.0)
    CVE-2012-4183: Resource Management Errors vulnerability (10.0)
    CVE-2012-4184: Permissions, Privileges, and Access Controls vulnerability (9.3)
    CVE-2012-4185: Improper Restriction of Operations within the Bounds of a Memory Buffer vulnerability (10.0)
    CVE-2012-4186: Improper Restriction of Operations within the Bounds of a Memory Buffer vulnerability (10.0)
    CVE-2012-4187: Improper Restriction of Operations within the Bounds of a Memory Buffer vulnerability (10.0)
    CVE-2012-4188: Improper Restriction of Operations within the Bounds of a Memory Buffer vulnerability (10.0)
    CVE-2012-4192: Permissions, Privileges, and Access Controls vulnerability (4.3)
    CVE-2012-4193: Design Error vulnerability (9.3)
    CVE-2012-4194: Improper Neutralization of Input During Web Page Generation ('Cross-site Scripting') vulnerability (4.3)
    CVE-2012-4195: Permissions, Privileges, and Access Controls vulnerability (5.1)
    CVE-2012-4196: Permissions, Privileges, and Access Controls vulnerability (5.0)
    This notification describes vulnerabilities fixed in third-party components that are included in Oracle's product distributions. Information about vulnerabilities affecting Oracle products can be found on the Oracle Critical Patch Updates and Security Alerts page. Note: Solaris 10 patches SPARC: 145080-13 and X86: 145081-12 contain the fix for all CVEs between Firefox versions 10.0.7 and 10.0.12.

    Read the article

  • The balance between client and server functionality

    - by Eugen Martynov
    I want to bring up a discussion that started in our teams and get your opinion on it. Assume we have a user account which can have different credentials for authentication and an associated email for recovery. A user can sign up with an email address or use a social profile to complete the signup process. The REST API from the backend to the client looks like: create account, authorise, update user data, link social account, register email, verify email. In addition, our backend is distributed and divided between several services/servers/clusters, so different calls go to different endpoints. In the case of social sign-up, some of the steps can be skipped or simplified. For example, with Facebook signup we can skip the email registration and verification steps (we ask the user for email permission), link the social account, and pre-fill the user's display name. So we proposed adding another endpoint that hides/combines the different backend calls and returns the result of the whole process to the clients. The pros of this approach: no more duplication of functionality between clients; faster networking and better user experience. The cons: additional work for the backend; probably more complex scenarios in future updates. I would like to get your opinion or experience with this situation, especially if you have already run into the second con.
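
    A hedged sketch of the proposed combined endpoint, in Python with invented names (fetch_profile, create_account, link_social_account and register_email are stand-ins for the real per-service calls, not an actual API): the backend orchestrates the individual steps and the client makes a single call.

        # Illustrative only: the sub-calls are passed in so the orchestration logic stays visible.
        def social_signup(facebook_token, fetch_profile, create_account,
                          link_social_account, register_email):
            profile = fetch_profile(facebook_token)                    # display name + verified email
            account = create_account(display_name=profile["name"])
            link_social_account(account, "facebook", facebook_token)
            register_email(account, profile["email"], verified=True)  # verification step skipped
            return {"account_id": account["id"], "status": "complete"}

    The trade-off is the one listed in the cons: the orchestration, including its failure handling (e.g. a partial signup when one sub-call fails), now lives on the backend and must be updated whenever the underlying steps change.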

    Read the article

  • What is the better way to retrieve data in an application that handles a limited amount of data?

    - by Milanix
    Just moved this question from Stack Overflow, since adding my code snippets would make it really long. I am mainly interested in better ways to retrieve data in an application that handles a limited amount of data which isn't updated regularly. Let's take this example: I am writing an application which gets a schedule as XML from a server. I have written logic to parse the XML version and update the database only if that version is newer than the local version. Although the update is checked automatically/manually on a daily basis depending on user preference, the actual version update happens only once every few months or so, since it is made by another authority which doesn't provide an API but rather announces its changes publicly. The actual XML contains (n number of groups) x (days in a week) x (n number of schedules). The number of groups is usually 6 and the number of schedules is usually 2, so basically there are usually only around 100 strings. I am using SQLite at the moment, and I want to know how to perform the database update. Should I show a progress dialog saying the application is updating and exit the app when it's done? Since my updates are infrequent I don't think this would really harm the user experience, but is there a better way to do it? I don't want the update to happen while the user is searching, which also uses the database; that causes a 'database already open' exception, which I have run into before. Is it better to parse the XML every time the user wants to view something, or to use SQLite? And since I make heavy use of adapters in my app to create lists, will that degrade performance?
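
    A sketch of the version-check-then-update idea using only the Python standard library (the XML layout, a top-level version attribute, is an assumption; the real feed will differ): the schedule rows are swapped inside a single transaction, so readers see either the old data or the new data, never a half-applied update.

        import sqlite3
        import xml.etree.ElementTree as ET

        def update_if_newer(xml_text, db_path="schedule.db"):
            root = ET.fromstring(xml_text)
            remote_version = int(root.get("version"))          # assumed <schedule version="N"> layout
            con = sqlite3.connect(db_path)
            con.execute("CREATE TABLE IF NOT EXISTS meta (version INTEGER)")
            row = con.execute("SELECT version FROM meta").fetchone()
            local_version = row[0] if row else -1
            if remote_version <= local_version:
                con.close()
                return False                                   # nothing to do, keep serving old data
            with con:                                          # one transaction for the whole swap
                con.execute("DELETE FROM meta")
                con.execute("INSERT INTO meta VALUES (?)", (remote_version,))
                # ... delete and re-insert the ~100 schedule rows here ...
            con.close()
            return True

    Because the swap is a single short transaction over roughly 100 rows, a blocking "updating" dialog shouldn't be necessary, and sharing one database helper/connection between search and update typically avoids the "database already open" exception mentioned above.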

    Read the article

  • Connecting / disconnecting DisplayPort causes crash

    - by iGadget
    I wanted to file a bug about this using ubuntu-bug xserver-xorg-video-intel, but the system prompted me to try posting here first. So here goes :-) While the situation in Ubuntu 11.10 was still somewhat workable (see UI freezes when disconnecting DisplayPort), in 12.04 (using Unity 3D) it has gotten worse. The weird part is that during the 12.04 betas, the situation was actually improving! I was able to successfully connect and disconnect a DisplayPort monitor without the system breaking down on me. But now with 12.04 final (with all updates), it's just plain terrible. When I now connect an external monitor using the DisplayPort connector on my HP ProBook 6550b, it only works sometimes. Most times (but not always!) the screen just goes blank and the system seems to crash (not even CTRL+ALT+F1 works anymore). Only a hard shutdown by keeping the power button pressed for several seconds and then a restart gets me out of this. I suspect the chances of the system crashing become higher as the system's uptime increases, especially when there have been one or more suspend-resume cycles (although I have also experienced this bug once from a cold boot). Disconnecting is roughly the same as with 11.10 (see issue mentioned above), with the difference that if I resume from suspend, I no longer have to do a CTRL+ALT+F1, ALT+F7 cycle to get my screen back. So what more can I try? Or should I just go ahead and file the bug anyway?

    Read the article

  • Best way to update UI when dealing with data synchronization

    - by developerdoug
    I'm working on a bug at work. The app is written in Objective-C for an iOS device, the iPad. I'm the new guy there and I've been given a hard task: sometimes the UIButton text property does not show the correct state when syncing. Basically, when the app is syncing my UI control should say "Syncing", and when it's not syncing it should display "Updated @ [specific date]". Right now there is a property on the app delegate called "SyncInProgress". Querying / syncing, which happens on a background thread, updates a counter, and the property returns a bool by checking the expression 'counter > 0'. There are three states I need to deal with: sync has started, sync is updating tables, sync finished. These need to occur in order. My coworker suggested taking a state-based approach instead of just responding to events, but I'm not sure how to go about that. Would it be best to have the UI receive a notification telling it what state it's in, or to poll every so often to see whether the state changed? Here are two posts I put on Stack Overflow in the last few days that relate to this: http://stackoverflow.com/questions/11025469/ios-syncing-using-a-state-approach-instead-of-just-reacting-to-events http://stackoverflow.com/questions/11037930/viewcontroller-when-viewwillappear-called-does-not-always-correctly-reflect-stat Any ideas anyone might have would be very much appreciated. Thanks, developerDoug
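
    A language-agnostic sketch of the state-based approach (Python standing in for the Objective-C app; SyncState and the observer wiring are illustrative, not the real API): the sync code walks through explicit states in order and pushes each change to registered observers, so the button text is derived from the current state rather than from a counter.

        from enum import Enum, auto

        class SyncState(Enum):
            STARTED = auto()
            UPDATING_TABLES = auto()
            FINISHED = auto()

        def button_text(state, finished_at=None):
            if state is SyncState.FINISHED:
                return "Updated @ %s" % finished_at
            return "Syncing"

        class SyncManager:
            def __init__(self):
                self.state = None
                self.observers = []              # e.g. the view controller's refresh callback

            def set_state(self, new_state):
                self.state = new_state           # states are only ever set in order
                for notify in self.observers:
                    notify(new_state)            # push: the UI never polls a stale flag

        manager = SyncManager()
        manager.observers.append(lambda s: print("button ->", button_text(s, "2012-06-15")))
        for s in (SyncState.STARTED, SyncState.UPDATING_TABLES, SyncState.FINISHED):
            manager.set_state(s)

    In the iOS app the push would be an NSNotification or a delegate call made on the main thread; the point of the sketch is only that the UI asks "what state am I in?" instead of "did an event fire?".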

    Read the article

  • Gmail Now Supports Google Drive Integration; Share Files Up to 10GB

    - by Jason Fitzpatrick
    Gmail users can now easily send large files thanks to Google Drive's increased integration with Gmail: blow through the 25MB in-email attachment limit and share files up to 10GB. From the official Gmail announcement: Have you ever tried to attach a file to an email only to find out it's too large to send? Now with Drive, you can insert files up to 10GB (400 times larger than what you can send as a traditional attachment). Also, because you're sending a file stored in the cloud, all your recipients will have access to the same, most up-to-date version. Like a smart assistant, Gmail will also double-check that your recipients all have access to any files you're sending. This works like Gmail's forgotten attachment detector: whenever you send a file from Drive that isn't shared with everyone, you'll be prompted with the option to change the file's sharing settings without leaving your email. It'll even work with Drive links pasted directly into emails. The new Gmail/Drive integration is rolling out in waves to users over the next few days and is accessible via the new Gmail compose window.

    Read the article

  • How to ensure nvidia_current module loads during boot

    - by Aras
    I am running Ubuntu 12.10 on an Asus G75V laptop with an NVIDIA GeForce GTX 660M. I first ran 12.04 on this machine and was able to install the nvidia-current drivers from the swat PPA:
        sudo apt-add-repository ppa:ubuntu-x-swat/x-updates
        sudo apt-get update
        sudo apt-get install nvidia-current
    This worked in 12.04, and after rebooting the machine my graphics were working properly. After the upgrade to 12.10, however, the machine boots into a low-resolution desktop which I cannot really interact with. I suspect this is because the driver isn't being loaded properly. To fix it, I have to switch to a Ctrl+Alt+F1 session, manually load the nvidia_current module, and restart the display manager:
        sudo modprobe nvidia_current
        sudo service lightdm restart
    Now everything works fine again. However, I would like not to have to do this every time I reboot the machine, and I don't want to hack a script to do this on load. Basically, if things are set up correctly, the installed nvidia_current driver should load. How can I make sure the nvidia_current driver module loads properly when the system starts?

    Read the article

  • GLES2.0 3D Android game performance and multi-threading the update?

    - by Ofer
    I have profiled my mixed Java/C++ Android game and got the following result: https://dl.dropbox.com/u/8025882/PompiDev/AndroidProfile.png As you can see, the pink chunk is a C++ function that updates the game. It does things like updating the logic, but mostly it generates a "request list" for rendering. The thing is, I generate the draw lists in C++ and then send them to Java to process and draw using GLES 2.0. Since then I have been able to improve the update from 9ms down to about 7ms, but I would like to ask whether I would benefit from multi-threading the update. As I understand the diagram, the function that takes the most time is the one whose color you see on the timeline, so the pink area is taken up mostly by the update. The other area has MainOpenGL.Handle as its main contributor (which is my Java function), but since it isn't drawn at the top of the diagram, can I conclude that other things are happening at the same time that use the CPU? Or even GPU work that isn't shown in this diagram? I am not sure how the GPU fits in here. Does it calculate things in parallel with the CPU, or does it count as part of the CPU usage, as in an SoC? I'm not sure. Anyway, in case GPU work DOES happen in parallel with the CPU, I would guess that if I run this C++ update in parallel with the thread that makes the OpenGL calls, I might make use of "dead CPU time" due to GPU stalling, or maybe have the GPU calls processed earlier because they won't have to wait for the update to finish. How do you suggest improving performance based on that? Thanks.
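
    A conceptual sketch of the update/render split (Python threads used only to illustrate the idea; this is not the actual Java/C++ code): the update thread produces the draw list for frame n+1 while the render thread consumes the draw list of frame n, at the cost of one frame of latency.

        import queue
        import threading

        draw_lists = queue.Queue(maxsize=1)        # at most one frame in flight

        def update_thread():
            for frame in range(300):
                draw_list = ["draw calls for frame %d" % frame]   # stands in for ~7 ms of game logic
                draw_lists.put(draw_list)          # blocks if the renderer falls a full frame behind
            draw_lists.put(None)                   # signal shutdown

        def render_thread():
            while True:
                draw_list = draw_lists.get()       # GL calls would go here, overlapping the next update
                if draw_list is None:
                    break

        threading.Thread(target=update_thread).start()
        render_thread()

    Whether this helps in practice depends on whether the GL thread is actually idle while the update runs; profiling the two threads separately is the only way to tell.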

    Read the article

  • What should I recommend to a small company looking for C# developers

    - by Coder
    Here is the issue. I am a senior developer, and one of the start-ups whose system I designed (management system/database/web) a long time ago has grown and needs software updates. I handed their system over to another developer a long time ago, but apparently he has left the job, so they are asking me if I can suggest where to find a new one. The problem is that the company has no clue that IT is not cheap. They expect multiple features to be added for $40, so that's an issue; it's actually one of the reasons I left the project when I did. Lots of expectations, little pay, and I also know those people outside work, so I decided to avoid straining the non-work relationships and left the project gracefully. Today they asked me for advice, and I told them that the feature list they want will probably cost something if they get a senior developer for the job. So I guess their best bet is to find someone who loves coding and has just finished school. That would give someone a chance to code for money, which is good for a student, and at the same time let the student get some hands-on experience. Then again, the system is not exactly a 20-line console program; there is an MSSQL database, an ASP.NET web page, and a content management system with all the AJAX stuff and some other things, so a student straight out of school could have some problems with that. But, thinking about the issue some more, a junior developer is a tricky deal: without mentoring, he can either screw up royally or just do what's asked. Also, it seems no one is coming to interviews at all, which is weird, or maybe not. What should I suggest to them?

    Read the article

  • How do I fix a terrible system error on ubuntu 12.04

    - by Anonymous
    I don't know what happened, but one day my computer had some sort of system error and could no longer update itself. The Software Center will not open: it begins to initialize and then a message pops up saying there's a system error and the Software Center needs to shut down. Then, after I go to report it, another box pops up saying it was unable to identify the source or package name. I also can't extract a zipped folder anymore, or reinstall Ubuntu from a USB boot drive; it keeps telling me my computer isn't compatible, when I know for a fact it is, because that's how I got Ubuntu on here in the first place. The only thing I know about this error is that after I checked for updates a message popped up saying to report the problem and include this message in the report: 'E:malformed line 56 in source list /etc/apt/sources.list (dist parse)'. It also called it a bug. I just want to know how to either get rid of the bug completely or find some way to reinstall Ubuntu. I know it's not a lot of information, but it's all I can give. Sorry.

    Read the article

  • User Switching in XFCE 12.04 with LightDM and dumping unnecessary Gnome libs

    - by user111120
    I'm an elder, non-techie Mac-to-Linux convert trying to play the Linux tech game by ear, so please be gentle! :) I am running XFCE Ubuntu 12.04 entirely on an 8-gig flash drive and it's fantastic. I am starting to run into potential space issues (down to 1.0 gig free from 1.9 gigs since it was installed last summer), most likely because of growing Thunderbird mail files, and this prompted my question. I just installed LightDM on my system because I want the ability to switch users in XFCE, following instructions on another blog. They advised using LightDM instead of GDM because LightDM doesn't pull in Gnome libraries. That's great since I need the space, but my question is: how can I tell whether I already have Gnome libraries installed from other updates and such? And can I minimize the Gnome libraries I have? The method for switching users entails creating a "fast-user-switch" file in /usr/local/bin; is there any easier way? One last thing, so I don't have to open another needless thread: while experimenting I somehow lost the share folder in one of my accounts. Is there any way to get a share folder back? Thanks for any tips! Jim in NYC

    Read the article

  • Caching by in-memory dictionaries. Are we doing it all wrong?

    - by user73983
    This approach is pretty much the accepted way to do anything in our company. A simple example: when a piece of data for a customer is requested from a service, we fetch all the data for that customer (the part relevant to the service) and save it in an in-memory dictionary, then serve it from there on subsequent requests (we run singleton services). Any update goes to the DB, then updates the in-memory dictionary. It all seems simple and harmless, but as we implement more complicated business rules the cache gets out of sync and we have to deal with hard-to-find bugs. Sometimes we defer writing to the database, keeping new data only in the cache until then. There are cases where we store millions of rows in memory because the table has many relations to other tables and we need to show aggregate data quickly. All this cache handling is a big part of our codebase, and I sense this is not the right way to do it. All of this juggling adds too much noise to the code and makes it hard to understand the actual business logic. However, I don't think we can serve data in a reasonable amount of time if we have to hit the database every time. I am unhappy about the current situation but I don't have a better alternative. My only idea would be to use the NHibernate 2nd-level cache, but I have nearly no experience with it. I know many companies use Redis or memcached heavily to gain performance, but I have no idea how I would integrate them into our system. I also don't know whether they can perform better than in-memory data structures and queries. Are there any alternative approaches I should look into?
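
    A minimal sketch of the pattern described above (Python, illustrative names): a read-through cache where every write goes to the database first and the cached entry is invalidated rather than patched, so the next read rebuilds it from the source of truth.

        class CustomerCache:
            def __init__(self, db):
                self.db = db                  # anything providing load_customer / save_customer
                self.cache = {}

            def get(self, customer_id):
                if customer_id not in self.cache:
                    self.cache[customer_id] = self.db.load_customer(customer_id)
                return self.cache[customer_id]

            def update(self, customer_id, changes):
                self.db.save_customer(customer_id, changes)   # DB remains the source of truth
                self.cache.pop(customer_id, None)             # drop the entry, don't patch it in place

    Invalidate-on-write costs one extra database read after each update, but it removes the class of bugs where the dictionary and the database drift apart, which is roughly the discipline an ORM 2nd-level cache or a Redis/memcached layer would impose anyway; the deferred writes and million-row aggregates are the parts that genuinely need a different tool.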

    Read the article

  • MVC: How to Implement Linked Views?

    - by cw'
    I'm developing a Java application to visualize time series. I need (at least) three linked views, meaning that interaction with one of them updates the others. The views are: a list showing the available and currently selected time series (the selected series are used as input for subsequent computations, and changing the selection should update the other views); a line chart displaying the available and selected time series (series should be selectable here by clicking on them); and a bar chart showing aggregated data on the time series (when zooming in to a period of time, the line chart should zoom to the same period, and vice versa). How do I implement this nicely, from a software-engineering point of view? I.e., I'd like to write reusable, clear, maintainable code, so I thought of the MVC pattern. At first I liked the idea of having the three view components observe my model class and refresh themselves when notified. But then, it didn't feel right to store view-related data in the model. Storing, e.g., the time series selection or plot zoom level in the model bakes assumptions about the view into it, which I wouldn't want in a reusable model. On the other hand, having the controllers observe each other results in a lot of dependencies: when adding another view, I'd have to register all other views as observers of the new view and vice versa, passing around many references and introducing dependencies. Maybe another "model" storing only view-related data would be the solution?
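
    A sketch of the idea in the last sentence (Python as a stand-in for the Java code; names are illustrative): keep the domain model, the time series data, untouched and put selection and zoom into a separate observable "view state" that all three views share.

        class ViewState:
            def __init__(self):
                self.selected_ids = set()
                self.time_window = (0.0, 1.0)
                self._observers = []

            def add_observer(self, callback):
                self._observers.append(callback)

            def _notify(self):
                for callback in self._observers:
                    callback(self)

            def select(self, ids):
                self.selected_ids = set(ids)
                self._notify()                   # list, line chart and bar chart all refresh

            def zoom(self, start, end):
                self.time_window = (start, end)
                self._notify()                   # line chart and bar chart stay in step

        state = ViewState()
        state.add_observer(lambda s: print("redraw with", s.selected_ids, s.time_window))
        state.select({"series_a"})
        state.zoom(0.25, 0.5)

    Each view then observes only the domain model and this view-state object, so adding a fourth linked view means registering one more observer instead of wiring every view to every other view.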

    Read the article

  • What's hot in Oracle Premier Support News for Solaris, Storage and Systems - How to Patch!

    - by user12244613
    Struggling to locate patches for Sun products? Can't find your Oracle system drivers? This question has been raised many times by customers and was the subject of a short video in the Oracle System Support Newsletter in February 2012. The transition from SunSolve to My Oracle Support changes how you think about the type of patch you're looking for. For example, in SunSolve you might have typed e1000g when looking for an Ethernet driver, but entering e1000g will not find anything in the My Oracle Support Patches and Updates menu. You need to use the Product (Advanced) search, which is driven off the product name, so you type "Ethernet" and select the Ethernet product you are looking for to locate its patches. Just to recap that video, if you are looking for the e1000g Ethernet driver, you need to use the Advanced Search and search for Ethernet:
    1. Log into My Oracle Support - select Patches and Updates - select Product or Family (Advanced Search).
    2. In the product line enter "Ethernet" and select the product name from the menu.
    3. Check "remove superseded patches" - that ensures you only get relevant, current patches in the results.
    4. Select Search and the results are displayed.
    Now you have further options, such as the platform (Solaris, Linux, etc.), if you want to narrow the search. Need more information? Log into My Oracle Support and watch the short 90-second video I put together. View the 7-minute video using Firefox/Chrome - it shows searching for individual patches, Solaris, firmware, etc. If you are not receiving the Oracle System Support Newsletter: Option (a) within My Oracle Support, make document id 1363390.1 a favourite and revisit it on the 2nd of each month for the latest content. Option (b) by default the Newsletter is sent to all customers who have logged a Service Request on an Oracle Systems Hardware Product during the last 12 months, unless you have opted out of receiving Oracle Communications in your profile on http://oracle.com

    Read the article

  • How do you track existing requirements over time?

    - by CaptainAwesomePants
    I'm a software engineer working on a complex, ongoing website. It has a lot of moving parts and a small team of UI designers and business folks adding new features and tweaking old ones. Over the last year or so, we've added hundreds of interesting little edge cases. Planning, implementing, and testing them is not a problem. The problem comes later, when we want to refactor or add another new feature. Nobody remembers half of the old features and edge cases from a year ago. When we want to add a new change, we notice that code does all sorts of things in there, and we're not entirely sure which things are intentional requirements and which are meaningless side effects. Did someone last year request that the login token was supposed to only be valid for 30 minutes, or did some programmers just pick a sensible default? Can we change it? Back when the product was first envisioned, we created some documentation describing how the site worked. Since then we created a few additional documents describing new features, but nobody ever goes back and updates those documents when new features are requested, so the only authoritative documentation is the code itself. But the code provides no justification, no reason for its actions: only the how, never the why. What do other long-running teams do to keep track of what the requirements were and why?

    Read the article

  • Ubuntu 12.04 won't shut down - stopping winbind daemon

    - by jan
    My Precise Pangolin sometimes won't shut down: the screen goes black with text on it. Usually the last line says something like "stopping winbind daemon" (sometimes also VirtualBox, which is listed above the winbind daemon; edit: sometimes the last line says "running unattended updates"), and it stays like this for about ten minutes. Then I usually hold the power button for 5 s to shut it down. It's very unpredictable: sometimes the computer shuts down without a problem and sometimes it hangs. I've tried many ways to shut it down: the hardware button, the panel applet, sudo shutdown -h now, sudo poweroff, sudo halt, etc. Even sudo reboot and a restart from the panel applet have this problem. Sometimes it works okay, but every method named has hung at least once on the same (damned) line. My specs: FUJITSU SIEMENS LIFEBOOK E8310, Intel Core2 Duo T7300 @ 2.00GHz, 3GB RAM, GPU: Mobile Intel(R) 965 Express Chipset Family, Ubuntu 12.04.2 32-bit, 3.5.0-41-generic kernel (but it did the same on older kernels and 12.04.x releases too). Any ideas what I should try next? Thanks a lot! Jan

    Read the article

  • hd0 out of disk error results in low-graphics mode

    - by msPeachy
    Yesterday I reinstalled Ubuntu due to an "hd0 out of disk" error on boot. Everything went fine: I installed apps, performed updates and upgraded the kernel. I even restarted it a few times just to check for boot issues, was glad that everything was working perfectly, and then powered it down. The next morning when I booted, I got the error "hd0 out of disk error. Press any key to continue..." again! After pressing a key, it took 10 minutes for the Ubuntu logo to appear with its 5 dots. After another 5 minutes, Ubuntu started checking the disk and displayed a message that / had errors, so I pressed F to fix them. After that, Ubuntu told me that /tmp was not yet ready for mounting, so I pressed S to skip mounting it, and then Ubuntu restarted. On boot I saw the error "hd0 out of disk error. Press any key to continue..." again. This time it took only a minute for the Ubuntu logo to appear, and after another minute a dialog box appeared with the following message: The system is running in low-graphics mode. Your screen, graphics card, and input settings could not be detected correctly. You will have to configure these yourself. What would you like to do? Run in low-graphics mode for just one session / Reconfigure graphics / Troubleshoot the error / Exit to console login. Whichever option I choose, I end up at a console prompt: grub-editenv: error: cannot read the file /boot/grub/grubenv. I can't do anything on this console; whatever I type, nothing happens. I've rebooted several times and I get the same error every time. I don't quite understand what is wrong with Ubuntu or with my installation. I've encountered this "hd0 out of disk" error several times already and have always ended up reinstalling. I'd really, really appreciate it if you guys can help me fix this. Thank you and good day.

    Read the article

  • Custom distro using ubuntu 12.04

    - by user89707
    I am creating a custom operating system based on Ubuntu 12.04. When logging in from LightDM, it shows "Ubuntu Desktop"; I need to change that to my OS name. I also need to replace the default Ambiance dark icon theme with my fs icons, for every login and for the live CD. How do I permanently change the OS name, so that it does not change even when the customer updates the operating system? I am using Remastersys, and I am looking to develop a new distro, like Mint. A brief explanation of creating the repository and maintaining the updates would be very helpful. Kindly provide a link on creating a full-fledged OS based on Ubuntu, as Mint, Snowlinux, etc. did. I also want to replace GRUB with BURG in the default installation. If Remastersys is not a good choice, then please suggest some other tool. I do not have high-speed internet.

    Read the article

  • Grub2 won't detect Ubuntu 11.10 OS after reinstalling Win XP hal.dll.

    - by yoopian
    Hi, I'm an Ubuntu newbie here. I've installed Ubuntu 11.10 to dual-boot on a single HDD. I did manual partitioning and basically forgot which sda my /boot partition is on. The installation worked out just fine and I installed updates with it. After a while, when I wanted to boot into Windows, it showed that I was missing a "hal.dll" file. I fixed this problem using the Windows resource CD, but then after booting up, my PC went straight to Windows XP. I tried to manually reinstall Grub2 using a live CD/USB and it worked, but I think I installed it on a different "sda#" (sda5 to be exact), because even though Grub2 loads when I boot my PC, only Windows XP shows up as an OS and Ubuntu 11.10 is missing. Next I tried boot-repair from the live CD/USB to solve my problems. Boot-repair tells me that the boot configuration was successful, but then a basic GRUB interface shows up (the black one with a grub command line), and now I can't even boot into Windows XP. Any help would be really appreciated. BTW, here are the notes from boot-repair that I was asked to save: http://paste.ubuntu.com/890228/ As you can see, there are boot files on sda5 and sda7. I think that's the core of the problem I have right now. Thanks in advance!

    Read the article
