Search Results

Search found 6068 results on 243 pages for 'goal tracking'.


  • Designing a system with different business rules for different customers

    - by user1595846
    My company is rewriting our proprietary business application. The current architecture is poorly done and inflexible. It is written in a procedural rather than object-oriented style and has become difficult to maintain. Our system is a web application written in .Net Webforms, and I am considering ASP.Net MVC for the rewrite. We intend to rewrite it with a good, solid architecture, with the goals of maintainability and reusable classes for some of our other systems and services. We would also like the system to be customizable for different customers in the event that we market the system. I am considering redesigning the system based on the layered architecture (Presentation, Business, Data Access layers) described in the Microsoft Patterns and Practices Application Architecture Guide: http://msdn.microsoft.com/en-us/library/ff650706.aspx

    Hopefully this isn't too open-ended, but how would you recommend allowing for different business logic/rules for different customers? I'm aware of Windows Workflow Foundation, but from what I've read about it, it seems many business rules could be too complicated to handle there. Also, can anyone point me to where I can download an example of a .Net solution that is based on the Application Architecture Guide? I have already downloaded the Layered Architecture Solution Guidance and the Expense Sample on CodePlex; I was looking for something a bit larger and more robust that I could step through to see how it works. If you feel there are better architectures to base our redesign on, please feel free to share. I appreciate your help!

    Read the article

  • Neural network input preprocessing

    - by TND
    It's clear that the effectiveness of a neural network depends strongly on the format you give it to work with. You want to preprocess it into the most convenient form you can algorithmically get to, so that the neural network doesn't have to account for that itself. I'm working on a little project that (surprise!) is going to be using neural networks. My future goal is to eventually use NEAT, which I'm really excited about. Anyway, one of my ideas involves moving entities in continuous 2D space, from a top-down perspective (this would be a really cool game AI). Of course, unless these guys are blind, they're going to be able to see the world around them. There's a lot of different ways this information could be fed into the network. One interesting but expensive way is to simply render a top-down "view" of things, with the entities as dots on the picture, and feed that in. I was hoping for something much simpler to use (at least at first), such as a list of the x (maybe 7 or so) nearest entities and their position in relative polar coordinates, orientation, health, etc., but I'm trying to think of the best way to do it. My first instinct was to order them by distance, which would inherently also train the neural network to consider those more "important". However, I was thinking- what if there's two entities that are nearly the same distance away? They could easily alternate indexes in that list, confusing the network. My question is, is there a better way of representing this? Essentially, the issue is the network needs a good way of keeping track of who's who, while knowing (by being inputted) relevant information about the list of entities it can see. Thanks!
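    To make the "nearest entities in relative polar coordinates" idea concrete, here is a minimal C++ sketch (the game could be written in any language; the Entity fields, the padding value for empty slots, and the helper name senseNearest are assumptions made for illustration):

        #include <algorithm>
        #include <cmath>
        #include <cstddef>
        #include <iostream>
        #include <vector>

        struct Entity { float x, y, heading, health; };

        // One network input slot per observed entity, in coordinates relative to the observer.
        struct Observation { float distance, relativeAngle, relativeHeading, health; };

        // Build a fixed-size list of the k nearest entities, nearest first.
        std::vector<Observation> senseNearest(const Entity& self,
                                              std::vector<Entity> others,
                                              std::size_t k) {
            auto dist = [&](const Entity& e) { return std::hypot(e.x - self.x, e.y - self.y); };
            std::sort(others.begin(), others.end(),
                      [&](const Entity& a, const Entity& b) { return dist(a) < dist(b); });

            std::vector<Observation> input;
            for (std::size_t i = 0; i < k && i < others.size(); ++i) {
                const Entity& e = others[i];
                float angle = std::atan2(e.y - self.y, e.x - self.x) - self.heading;
                input.push_back({dist(e), angle, e.heading - self.heading, e.health});
            }
            input.resize(k, {1e9f, 0.0f, 0.0f, 0.0f});  // pad unused slots with "very far away"
            return input;
        }

        int main() {
            Entity self{0, 0, 0, 1};
            std::vector<Entity> others{{3, 4, 0.5f, 1}, {1, 1, 0, 1}, {10, 0, 0, 1}};
            for (const Observation& o : senseNearest(self, others, 7))
                std::cout << o.distance << " " << o.relativeAngle << "\n";
        }

    Note that the std::sort call is exactly where two nearly equidistant entities can swap slots from one frame to the next, which is the instability described above; the sketch only fixes the encoding, not that ordering question.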

    Read the article

  • Modular Architecture for Processing Pipeline

    - by anjruu
    I am trying to design the architecture of a system that I will be implementing in C++, and I was wondering if people could think of a good approach, or critique the approach that I have designed so far. First of all, the general problem is an image processing pipeline. It contains several stages, and the goal is to design a highly modular solution, so that any of the stages can be easily swapped out and replaced with a piece of custom code (so that the user can have a speed increase if s/he knows that a certain stage is constrained in a certain way in his or her problem). The current thinking is something like this:

        struct output; /*Contains the output values from the pipeline.*/

        class input_routines{
        public:
            virtual foo stage1(...){...}
            virtual bar stage2(...){...}
            virtual qux stage3(...){...}
            ...
        };

        output pipeline(input_routines stages);

    This would allow people to subclass input_routines and override whichever stage they wanted. That said, I've worked in systems like this before, and I find the subclassing and the default stuff tends to get messy and can be difficult to use, so I'm not giddy about writing one myself. I was also thinking about a more STLish approach, where the different stages (there are 6 or 7) would be defaulted template parameters. Can anyone offer a critique of the pattern above, thoughts on the template approach, or any other architecture that comes to mind?
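    As a concrete illustration of the "STLish" alternative, here is a minimal sketch of stages as defaulted template parameters. The stage signatures are reduced to int-to-int stand-ins for real image buffers, and all names are illustrative assumptions rather than a proposed interface:

        #include <iostream>

        // Default stage policies; a user swaps any stage by supplying their own type
        // with a compatible operator().
        struct DefaultStage1 { int operator()(int in) const { return in + 1; } };
        struct DefaultStage2 { int operator()(int in) const { return in * 2; } };
        struct DefaultStage3 { int operator()(int in) const { return in - 3; } };

        template <typename S1 = DefaultStage1,
                  typename S2 = DefaultStage2,
                  typename S3 = DefaultStage3>
        class Pipeline {
        public:
            int run(int input) const { return s3_(s2_(s1_(input))); }
        private:
            S1 s1_; S2 s2_; S3 s3_;
        };

        // A custom stage that exploits a problem-specific constraint.
        struct FastStage2 { int operator()(int in) const { return in << 1; } };

        int main() {
            Pipeline<> standard;
            Pipeline<DefaultStage1, FastStage2> tuned;
            std::cout << standard.run(10) << " " << tuned.run(10) << "\n";
        }

    Swapping a stage is then a compile-time substitution, so a hand-tuned stage pays no virtual-dispatch cost; the trade-off is that the set of stages must be fixed at compile time, unlike the subclassing version.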

    Read the article

  • getting started as a web developer [closed]

    - by kmote
    I have over 10 years of programming experience building (Windows-based) desktop applications and utilities (VC++, C#, Python). My goal over the next year is to start transitioning to web application development. I want to teach myself the fundamental tools and technologies that would be considered essential for building professional, online, interactive, visually-stunning, data-driven web apps -- the kind described in Google's recently released "Field Guide: Building Great Web Applications". So my question is, what are the primary, most commonly-used technologies that seasoned professionals will need in their tool belt in the coming years? My plan was to start coming up to speed in Javascript, HTML5, & CSS, and then to do a deep dive into ASP.NET and Ajax, along with SQL DBs. (I was surprised to not be able to find a single book at Amazon with a broad, general scope like this, which caused me to start second-guessing this approach.) So, seasoned professionals: am I on the right track? Are there some glaring omissions in my list? Or some unnecessary inclusions? I would welcome any book suggestions along these lines as well.

    Read the article

  • Are there design patterns or generalised approaches for particle simulations?

    - by romeovs
    I'm working on a project (for college) in C++. The goal is to write a program that can more or less simulate a beam of particles flying through the LHC synchrotron. Not wanting to rush into things, my team and I are thinking about how to implement this, and I was wondering if there are general design patterns that are used to solve this kind of problem. The general approach we came up with so far is the following:

      - there is a World that holds all objects
      - you can add objects to this world, such as Particle, Dipole and Quadrupole
      - time is cut up into discrete steps, and at each point in time, for each Particle, the magnetic and electric forces that each object in the World generates are calculated and summed up (luckily electromagnetism is linear)
      - each Particle moves accordingly (using a simple estimation approach to solve the differential equations of motion)
      - save the Particle positions
      - repeat

    This seems a good approach, but, for instance, it is hard to take into account symmetries that might be present (such as the magnetic field of each Quadrupole), and it is thus suboptimal. To take such symmetries into account, it would be much easier to (also) make space discrete and store the form of the Quadrupole field somewhere. (Since 2532 or so Quadrupoles are used, this should lead to a massive gain in performance, by not having to recalculate each Quadrupole field.) So, are there any design patterns? Is the World approach feasible, or is it old-fashioned, bad programming? What about symmetry, how is that generally taken into account?
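    A minimal sketch of the World approach in C++, in 2D and with a toy force law (the class names, the UniformField stand-in for a Dipole or Quadrupole, and the semi-implicit Euler step are illustrative assumptions, not accelerator physics):

        #include <cmath>
        #include <iostream>
        #include <memory>
        #include <vector>

        struct Vec2 { double x = 0, y = 0; };
        static Vec2 operator+(Vec2 a, Vec2 b) { return {a.x + b.x, a.y + b.y}; }
        static Vec2 operator*(double s, Vec2 a) { return {s * a.x, s * a.y}; }

        struct Particle { Vec2 pos, vel; double charge = 1.0, mass = 1.0; };

        // Anything placed in the world only has to answer "what field do you add here?".
        class FieldSource {
        public:
            virtual ~FieldSource() = default;
            virtual Vec2 fieldAt(Vec2 p) const = 0;
        };

        class UniformField : public FieldSource {     // stand-in for a Dipole/Quadrupole
        public:
            explicit UniformField(Vec2 f) : f_(f) {}
            Vec2 fieldAt(Vec2) const override { return f_; }
        private:
            Vec2 f_;
        };

        class World {
        public:
            std::vector<std::unique_ptr<FieldSource>> sources;
            std::vector<Particle> particles;

            void step(double dt) {
                for (Particle& p : particles) {
                    Vec2 field{};                                   // superpose all sources
                    for (const auto& s : sources) field = field + s->fieldAt(p.pos);
                    Vec2 accel = (p.charge / p.mass) * field;       // toy force law
                    p.vel = p.vel + dt * accel;                     // semi-implicit Euler
                    p.pos = p.pos + dt * p.vel;
                }
            }
        };

        int main() {
            World w;
            w.sources.push_back(std::make_unique<UniformField>(Vec2{0.0, -1.0}));
            w.particles.push_back({{0, 0}, {1, 0}});
            for (int i = 0; i < 100; ++i) w.step(0.01);             // save positions here
            std::cout << w.particles[0].pos.x << " " << w.particles[0].pos.y << "\n";
        }

    The symmetry question then becomes local to one class: a Quadrupole's fieldAt could read from a precomputed grid instead of recomputing the analytic field, without the stepping loop changing at all.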

    Read the article

  • Move unity launcher to bottom of the screen

    - by argvar
    I have the Ubuntu 13.04 DESKTOP version, and for some odd reason I'm told that the Unity launcher cannot be moved to the bottom of the screen for several reasons:

      1. Canonical wants it there so it fits with their overall design goals, namely when it comes to touchscreen devices and netbooks. This in my mind totally ignores the fact that most Ubuntu users are DESKTOP users. No matter what Canonical's long-term goal is, it surely mustn't be at the expense of the needs of their core user base.

      2. Most monitors are widescreen, so the launcher is more compact where it is. This not only takes away the user's choice, but is also a wrong assessment. Widescreen monitors can sometimes be rotated on a pivot, giving them a portrait aspect. Displaying the Unity launcher on the left side takes up a lot of space. Many desktop users have multiple monitors, and having the launcher on the left side of each monitor is very awkward. Also, many websites are designed to fit on half of a 1920-pixel-wide display, so you can have two browser windows open side by side with all content visible. The placement of the Unity launcher takes away horizontal space, meaning there's less room for each browser window, and you'll see the right side of the web pages being occluded.

    Any suggestion to simply hide the Unity launcher, or "Canonical knows best" or "get used to it", is unwelcome and totally ignores the above points. Linux is about choice. Canonical's stubbornness with the Unity launcher placement is inconsistent with what Linux is about.

    Read the article

  • Transparent JPanel, Canvas background in JFrame

    - by Andy Tyurin
    I want to make a Canvas background and add some elements on top of it. For this goal I made a JPanel transparent with setOpaque(false) and added it as the first component of the JFrame's container, then I added a Canvas with a black background (in the future I want to run an animation on it) to the JFrame as the second element. But I can't understand why I see a grey background, not a black one. Any suggestions?

        public class Game extends JFrame {
            public Container container;      //Game container with components
            public Canvas backgroundLayer;   //Background layer of a game
            public JPanel elementsLayer;     //elements panel (top of backgroundLayer), holds different elements
            private Dimension startGameDimension = new Dimension(800, 600); //start game dimension

            public Game() {
                //init main window
                super("Astra LaserForces");
                setSize(startGameDimension);
                setBackground(Color.CYAN);
                container = getContentPane();
                container.setLayout(null);
                setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);

                //init jpanel elements layer
                elementsLayer = new JPanel();
                elementsLayer.setSize(startGameDimension);
                elementsLayer.setBackground(Color.BLUE);
                elementsLayer.setOpaque(false);
                container.add(elementsLayer);

                //init canvas background layer
                backgroundLayer = new Canvas();
                backgroundLayer.setSize(startGameDimension);
                backgroundLayer.setBackground(Color.BLACK); //set default black color
                container.add(backgroundLayer);
            }

            //start game
            public void start() {
                setVisible(true);
            }

            //create new instance of game and start it
            public static void main(String[] args) {
                new Game().start();
            }
        }

    Read the article

  • Finding the best practice for a game simulating tool

    - by Tougheart
    I'm studying Java right now, and I'm thinking of this tool as my practice project. The game is "League of Legends", in case anyone knows it. I'm not actually simulating the game as in simulating gameplay; I'm just trying to create a tool that can compare different champions to each other based on their own abilities and the items bought inside the game. The game basics are:

      - Every player has a champion in a team of 5 players playing against another team.
      - Each champion has a different set of abilities (usually 4) that s/he uses to do damage to opposing champions.
      - Each champion gets stronger by buying different items, increasing the attack it deals or decreasing the damage received.

    What I want to do is create a tool to be used outside the game, enabling players to try out different builds for their champions and compare the figures against other champions they usually fight against. The goal is to enable players to get a deeper understanding of the different item combinations (builds) that can be used during games, instead of trying them out in real games, which can be very time-consuming. What I'm stuck on is the best practice I should follow to make this possible using Java: I can't figure out which classes should inherit from which, or whether champion and item specs should live in the code or be extracted from other files, especially since I'm talking about hundreds of items and champions to use in that tool. I'm self-studying Java, and I don't have much practice at it, so I would really appreciate any broad guidelines regarding this, and sorry if my question doesn't fit here, I tried to follow the rules. English isn't my native language, so I'm really sorry if I wasn't clear enough; I would be more than happy to explain anything that's not understood.

    Read the article

  • How to create a bootable system with a squashfs root

    - by cldfzn
    My goal is to be able to take a customized root file system loaded with the software I want. So far I've created a squashed filesystem using debootstrap and chroot to install the software I want on the system. The problem I am now running into: whenever I boot into the system, my user accounts that were set up in the chroot do not work. On the first boot everything works out; on the second boot I can't log in. That is baffling to me. Anyone know a reason or a place to start looking?

    Update

    To get a working system with a squashfs filesystem:

        sudo apt-get install live-boot live-boot-initramfs-tools extlinux
        sudo update-initramfs -u

    Create a squashfs file from a bootstrapped or running Ubuntu filesystem with whatever packages you want available. https://help.ubuntu.com/community/LiveCDCustomizationFromScratch provides good instructions for creating a debootstrapped system to build on. Format the target drive with ext2/3/4 and enable the bootable flag. Create the folder layout on the target drive and install extlinux:

        mkdir -p ${TARGET}/boot/extlinux ${TARGET}/live
        extlinux -i ${TARGET}/boot/extlinux
        dd if=/usr/lib/syslinux/mbr.bin of=/dev/sdX   # X is the drive letter
        cp /boot/vmlinuz-$(uname -r) ${TARGET}/boot/vmlinuz
        cp /boot/initrd.img-$(uname -r) ${TARGET}/boot/initrd
        cp filesystem.squashfs ${TARGET}/live

    Create ${TARGET}/boot/extlinux/extlinux.conf with the following contents:

        DEFAULT Live
        LABEL Live
          KERNEL /boot/vmlinuz
          APPEND initrd=/boot/initrd boot=live toram=filesystem.squashfs
        TIMEOUT 10
        PROMPT 0

    Now you should be able to boot from the target drive into your squashed system.

    Read the article

  • Mozilla Persona to the login rescue?

    - by Matt Watson
    A lot of websites now allow us to log in or create accounts via OAuth or OpenID. We can use our Facebook, Twitter, Google, Windows Live accounts and others. The problem with a lot of these is that we have to allow the websites access to our account and profile data that they shouldn't really have. Below is a Twitter authorization screen, for example, shown when signing in via Technorati. Now Technorati can follow new people, update my profile and post tweets? All I wanted to do was log in to Technorati.com to comment on a post! Mozilla has just released their new solution for this, called Persona. First thought: oh great, another solution! But they are actually providing something a little different and better. It is based on an email address and isn't linked to anything like our personal social networks or their information. Persona only exists to help with logging in to websites. No loose strings attached. Persona is based on a new standard called BrowserID, and you can read more about it here: How BrowserID Works. The goal is to integrate BrowserID into the browser at a deeper level so that no password entry is required at all. You can tell your web browser to just sign you in automatically. I am really hoping this takes off and will look at implementing it in current projects! I would recommend researching it, and let's hope it or something like it becomes a widespread reality in the future.

    Read the article

  • Clutter for game GUI

    - by tjameson
    I'm pretty new to game development, having only written a simple 3d game for a class project, but I'd like to get started on a bigger project. I'm writing an MMORPG to run in both the browser (WebGL) and natively (OpenGL ES 2). In choosing a GUI toolkit, I'm trying to find a style that would work natively and would be simple to emulate in WebGL. I am considering using D or Go for writing my game, so interfacing with C++ libraries will be difficult, if not impossible. Of course, the language isn't the end goal here, so if using C++ will save considerable time, I'll bite the bullet and use that. In order to reduce the amount of code I'll have to write for the browser, I'm considering using something simple like Clutter for basic abstractions, which I think will be pretty easy to emulate (layered canvases maybe?). Does anyone have experience using Clutter for a 3d game? Note: I haven't used any game development libraries, and I only have limited experience with GUI libraries. I do have HTML+CSS experience, so maybe librocket is a viable solution?

    Read the article

  • Are there any good examples of open source C# projects with a large number of refactorings?

    - by Arjen Kruithof
    I'm doing research into software evolution and C#/.NET, specifically on identifying refactorings from changesets, so I'm looking for a suitable (XP-like) project that may serve as a test subject for extracting refactorings from version control history. Which open source C# projects have undergone large (number of) refactorings? Criteria A suitable project has its change history publicly available, has compilable code at most commits and at least several refactorings applied in the past. It does not have to be well-known, and the code quality or number of bugs is irrelevant. Preferably the code is in a Git or SVN repository. The result of this research will be a tool that automatically creates informative, concise comments for a changeset. This should improve on the common development practice of just not leaving any comments at all. EDIT: As Peter argues, ideally all commit comments would be teleological (goal-oriented). Practically, if a comment is made at all it is often descriptive, merely a summary of the changes. Sadly we're a long way from automatically inferring developer intentions!

    Read the article

  • Using R on your Oracle Data Warehouse

    - by jean-pierre.dijcks
    Since it is Predictive Analytics World in our backyard (or are we San Francisco's backyard…?) I figured it is well worth the time to dust off some old but important news. With big data (should we start calling it "any data analytics" instead?) being the buzzword and analytics the key operative goal, not moving data around is becoming more and more critical to the business users. Why? Because instead of spending time on moving data around into your next analytics server, you should be running analytics on those CPUs. You could always do this with Oracle Data Mining within the Oracle Database, but a lot of folks want to leverage R as their main tool. Well, this article describes how you can do this, since 2010… As Casimir Saternos concludes in the article: "There is a growing awareness of the need to effectively analyze astronomical amounts of data, much of which is stored in Oracle databases. Statistics and modeling techniques are used to improve a wide variety of business functions. ODM accessed using the R language increases the value of your data by uncovering additional information. RODM is a powerful tool to enable your organization to make predictions, classify data, and create visualizations that maximize effectiveness and efficiencies." Happy Analysis!

    Read the article

  • The Real Value Of Certification

    - by Brandye Barrington
    I read a quote recently by Rich Hein of CIO.com "Certifications are, like most things in life: The more you put into them, the more you will get out." This is what we tell candidates all the time. The real value in obtaining a certification is the time spent preparing for the exam. All the hours spent reading books, practicing in hands-on environments, asking questions and searching for answers is valuable. It's valuable preparation for the exam, but it's also valuable preparation for your future job role and for your career. If your goal is just to pass an exam, you've missed a very important part of the value of certification.We receive so many questions through different forms of social media on whether or not certification will help candidates get jobs or get better jobs. Surveys conducted by us and by independent entities all point to the job and salary benefits of certification. However, a key part of that equation is whether a candidate can actually perform successfully in a job role. If preparation time was used to practice and learn and master new skills rather than to memorize a brain dump, the candidate will probably perform successfully in their job role, and job opportunities and higher salary will likely follow. Candidates who do not show that initiative, will not likely reap the full benefits of certification.Keep this in mind as you approach your next certification exam. You are preparing for a career, not an exam. This may help you to be more appreciative of the long hours spent studying!

    Read the article

  • Drawing a texture line between two vectors in XNA WP7

    - by Krav
    I want to create a simple graph maker in WP7. The goal is to draw a textured line between two vectors that the user defines with touch. I have already made the rotation, and it is working, but not correctly, because it doesn't take the line texture's height into account, and because of that there are too many overlapping textures. So it does draw the line, but with too many textures. How could I calculate it correctly? Here is the code:

        public void DrawLine(Vector2 st, Vector2 dest, NodeUnit EdgeParent, NodeUnit EdgeChild)
        {
            float d = Vector2.Distance(st, dest);
            float rotate = (float)(Math.Atan2(st.Y - dest.Y, st.X - dest.X));
            direction = new Vector2((dest.X - st.X) / (float)d, (dest.Y - st.Y) / (float)d);
            Vector2 _pos = st;

            World.TheHive.Add(new LineHiveMind(linetexture, _pos, rotate, EdgeParent, EdgeChild, new List<LineUnit>()));
            for (int i = 0; i < d; i++)
            {
                World.TheHive.Last()._lines.Add(new LineUnit(linetexture, _pos, rotate, EdgeParent, EdgeChild));
                _pos += direction;
            }
        }

      - d is the distance between st (the starting node) and dest (the destination node)
      - rotate is the rotation
      - direction is the direction from the starting node to the destination node
      - _pos is the position of the current segment, starting at st

    Thanks for any suggestions/help!

    Read the article

  • Will Unity skills be interchangeable?

    - by Starkers
    I'm currently learning Unity and working my way through a video game maths primer textbook. My goal is to create a racing game for WebGL (using Three.js and maybe Physic.js). I'm well aware that the Unity program shields you from a lot of what's going on and a lot of the grunt work attached to developing even a simple video game, but if I power through a bunch of Unity tutorials, will a lot of the skills I learn translate over to other frameworks/engines? I'm pretty proficient at level design with WebGL, and I'm a good 3D modeller. My weaknesses are definitely AI and physics. While I am rapidly shoring up my math, and while physics is undeniably interesting, there are only so many hours in the day, and there's a wealth of engines out there to take care of this sort of thing. AI does appeal to me a lot more, and is a lot more necessary. AI changes drastically from game to game and is tweaked heavily during development, while the physics is a lot more constant. Will learning AI concepts in Unity allow me to transfer this knowledge pretty much anywhere? Or will I just be paddling up Unity creek with these skills?

    Read the article

  • Getting into the details of game engine programming

    - by Darkslash
    I am interested in learning game programming, but I really have an interest in the lower-level engineering in games. I have OpenGL experience, and I am really interested in learning more about implementing AI, physics, etc. I have a computer science degree, so I really like getting into technical stuff. Many times when I ask about this sort of thing, I get a lot of "Use an engine", "Use Unity3d", "Why waste your time writing code that already exists", etc., etc. My idea was to use simpler libraries such as SFML or XNA so that I could learn how to implement the more complex systems. The thing is, although I do want to write games, I want to learn things that using something like Unity simply doesn't teach you. My goal is not to make a current-generation-quality 3D game to sell, I just want to make some cool smaller games and learn all I can about the programming side of game development. Is this something that people just do not do anymore? It seems like everywhere I turn people are using Unity or UDK or GameMaker. I fully understand why you would use tools like these, but I can't see how they would suit my purposes. So where does someone like myself turn? Am I trying to learn something that people just do not bother doing anymore? Is the innovation in this area gone and is it all about gameplay now? I'm sorry if this question seems silly, but I am genuinely interested in knowing more about this and meeting more people who are interested in this sort of thing.

    Read the article

  • Ubuntu 12.04 hangs on install using VMware Player on Windows 7

    - by Beauness_Round
    At my job, we're migrating to Ubuntu 12.04 from 10.10 (we skipped a generation), so everyone needs to make one or more new virtual machines for 12.04 on their Windows 7 workstation. The problem is, only some have had success while most encounter a hang during install. I was tasked with researching this problem and here are my findings (though somewhat inconclusive). I have tried uninstalling and reinstalling VMware Player, using versions 4.0.2, 4.0.4, and 5.0.0, but every install hangs at "retrieving file 55 of 129." Additionally, I have installed before updating the VMware Tools, as well as after (one can pre-download them), with every install again hanging at "retrieving file 55 of 129." The file I was using for the above two attempts is called ubuntu-12.04-desktop-amd64.iso. I have also burned a disc with ubuntu-12.04.1-desktop-amd64.iso and tried to install it, and later tried ubuntu-12.04.1-desktop-i386.iso (which is not what I want, but I wanted to see if it would work), but both times I get:

        PXE-E53: No boot filename received
        PXE-M0F: Exiting Intel PXE ROM.
        Operating System not found

    This is all in VMware, and the goal is to find a settings recipe that will allow an unbroken 64-bit Ubuntu 12.04 install from an .iso hosted on the Windows OS. Thank you in advance to anyone with ideas!

    Read the article

  • Writing generic code when your target is a C compiler

    - by enobayram
    I need to write some algorithms for a PIC microcontroller. AFAIK, the official tools support either assembler or a subset of C. My goal is to write the algorithms in a generic and reusable way without losing any runtime or memory performance. And if possible, I would like to do this without increasing the development time much or compromising readability and maintainability much either. What I mean by generic and reusable is that I don't want to commit to types, array sizes, the number of bits in a bit field, etc. All these specifications, IMHO, point to C++ templates, but there's no compiler for them for my target. C macro metaprogramming is another option, but, again in my opinion, that greatly reduces readability and increases development time. I believe what I'm looking for is a decent C++ to C translator, but I'd like to hear anything else that satisfies the above requirements. Maybe a translator from another high-level language to C that produces very efficient code, maybe something else. Please note that I have nothing against C, I just wish templates were available in it.

    Read the article

  • How to divide work to a network of computers?

    - by Morpork
    Imagine a scenario as follows: let's say you have a central computer which generates a lot of data. This data must go through some processing, which unfortunately takes longer than generating it. In order for the processing to catch up with real time, we plug in more slave computers. Further, we must take into account the possibility of slaves dropping out of the network mid-job as well as additional slaves being added. The central computer should ensure that all jobs are finished to its satisfaction, and that jobs dropped by a slave are retasked to another. The main question is: what approach should I use to achieve this? But perhaps the following would help me arrive at an answer:

      - Is there a name or design pattern for what I am trying to do?
      - What domain of knowledge do I need to achieve the goal of getting these computers to talk to each other? (E.g. will a database, which I have some knowledge of, be enough, or will this involve sockets, which I have yet to learn about?)
      - Are there any examples of such a system?

    The main question is a bit general, so it would be good to have a starting point/reference point. Note that I am assuming constraints of C++ and Windows, so solutions pointing in that direction would be appreciated.
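    As a rough starting point, here is a minimal in-process C++ sketch of that idea, often described as a job queue with retasking. The threads stand in for networked slaves, the random drop-out merely simulates a slave dying mid-job, and all names are illustrative; a real system would put sockets or a message queue between the master and the slaves:

        #include <chrono>
        #include <deque>
        #include <iostream>
        #include <mutex>
        #include <random>
        #include <thread>
        #include <vector>

        // The "central computer" owns the queue. A job leaves the queue while a slave
        // works on it and is pushed back (retasked) if that slave drops out.
        struct JobQueue {
            std::deque<int> pending;
            std::mutex m;

            bool take(int& job) {
                std::lock_guard<std::mutex> lock(m);
                if (pending.empty()) return false;
                job = pending.front();
                pending.pop_front();
                return true;
            }
            void retask(int job) {
                std::lock_guard<std::mutex> lock(m);
                pending.push_back(job);
            }
            bool empty() {
                std::lock_guard<std::mutex> lock(m);
                return pending.empty();
            }
        };

        // A slave: pulls jobs until the queue is empty or it "drops out" mid-job.
        void slave(JobQueue& q, unsigned seed) {
            std::mt19937 rng(seed);
            std::uniform_real_distribution<double> coin(0.0, 1.0);
            int job;
            while (q.take(job)) {
                if (coin(rng) < 0.1) { q.retask(job); return; }   // simulate a dropped slave
                std::this_thread::sleep_for(std::chrono::milliseconds(2)); // "processing"
            }
        }

        int main() {
            JobQueue q;
            for (int i = 0; i < 100; ++i) q.pending.push_back(i);

            // The master keeps spawning replacement slaves until every job has finished.
            int generation = 0;
            while (!q.empty()) {
                std::vector<std::thread> slaves;
                for (int id = 0; id < 4; ++id)
                    slaves.emplace_back(slave, std::ref(q), ++generation * 100 + id);
                for (auto& t : slaves) t.join();
            }
            std::cout << "all jobs completed\n";
        }

    The usual names for this arrangement are master/worker or task farm, which may help as search terms; the retasking bookkeeping on the master stays the same whether the workers are threads, processes, or machines reached over sockets.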

    Read the article

  • Fun programming or something else?

    - by gion_13
    I've recently heard about Android's isUserAGoat method and I didn't know what to think. At first I laughed my brains out, then I was embarrassed by my lack of professionalism and tried to look into it and see if it makes any normal sense. As it turns out, it is a joke (as stated here), and it appears that other languages/APIs have these sorts of easter eggs implemented in their core. While I personally like them and feel they can be a breath of fresh air sometimes, I think that they can also be both frustrating and confusing (and you begin to ask yourself: "can users be goats?" or "I get it! "goat" is slang for.... wait.."). My question is: are there any other examples of this kind of programming joke, and what is their intent? Should they be considered harmless or not (how do programmers feel about it)? Do they reach their goal (if any, other than to get a laugh)? Where do you draw the line between a good joke and a disaster? (What if the method was called isUserStupid?)

    Read the article

  • How to run around another football player

    - by Lumis
    I have finished a simple 2D one-on-one indoor football Android game. The thing that seemed so simple to me, a human being, turned out to be difficult for a computer: how to go around the opponent. At the moment the game logic of the computer player is that if it bumps into the human player it will step back a few points on the pixel grid and then try again to go towards the ball. The problem is that if the human player is in between, the computer player will oscillate in one place, which does not look very nice, and the human opponent can use this weakness to control the game. You can see this in the photo – at the moment the computer will go along the red line indefinitely. I tried a few ideas, but it proved not easy when both the human player and the ball are constantly moving, so at each step the computer would change direction and "oscillate" again. Once the computer player reaches the ball, it will kick it with a certain amount of random strength and direction towards the human's goal. The question here is how to formulate the logic of going around the ever-moving human opponent and how to translate it into the coordinate system and frame-by-frame animation… any suggestions welcome.
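    One common way to phrase the "go around" logic is as a detour waypoint: if the opponent sits close to the straight line from the computer player to the ball, steer toward a point offset sideways from the opponent instead of toward the ball itself. Below is a minimal geometric sketch (written in C++ for brevity, although the game itself is an Android project; the clearance constant and the function names are illustrative):

        #include <cmath>
        #include <iostream>

        struct Vec2 { float x, y; };

        static Vec2 sub(Vec2 a, Vec2 b) { return {a.x - b.x, a.y - b.y}; }
        static Vec2 add(Vec2 a, Vec2 b) { return {a.x + b.x, a.y + b.y}; }
        static Vec2 scale(Vec2 a, float s) { return {a.x * s, a.y * s}; }
        static float dot(Vec2 a, Vec2 b) { return a.x * b.x + a.y * b.y; }
        static float length(Vec2 a) { return std::sqrt(dot(a, a)); }

        // Decide where the computer player should head this frame.
        // If the opponent is within 'clearance' of the straight line to the ball,
        // return a waypoint offset sideways from the opponent; otherwise return the ball.
        Vec2 chooseTarget(Vec2 self, Vec2 ball, Vec2 opponent, float clearance) {
            Vec2 toBall = sub(ball, self);
            float segLen = length(toBall);
            if (segLen < 1e-4f) return ball;
            Vec2 dir = scale(toBall, 1.0f / segLen);

            // Project the opponent onto the self -> ball segment.
            Vec2 toOpp = sub(opponent, self);
            float along = dot(toOpp, dir);
            if (along < 0.0f || along > segLen) return ball;        // opponent not in between
            Vec2 closest = add(self, scale(dir, along));
            Vec2 offset = sub(opponent, closest);
            if (length(offset) > clearance) return ball;            // path is clear enough

            // Detour: step to the side of the opponent that is farther from the path.
            Vec2 side = {-dir.y, dir.x};                            // perpendicular to the path
            float sign = (dot(offset, side) > 0.0f) ? -1.0f : 1.0f;
            return add(opponent, scale(side, sign * clearance));
        }

        int main() {
            Vec2 target = chooseTarget({0, 0}, {10, 0}, {5, 0.5f}, 2.0f);
            std::cout << target.x << " " << target.y << "\n";       // a point below the opponent
        }

    Because the waypoint is recomputed every frame from the current positions, it slides around with the moving ball and opponent instead of bouncing back and forth along one fixed line.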

    Read the article

  • How do I upgrade from ubuntu 9.10 to 12.10 on my Acer Aspire 3000

    - by 770
    I had my Acer Aspire 3000 set up as a dual boot with XP/Ubuntu 9.10 a couple of years ago. I recently blew the dust off it and wanted to upgrade to Windows 7/Ubuntu 12.10, so I began by formatting the Ubuntu side of the partition and apparently damaged the MBR, as I could only get a black screen with the error message:

        GRUB loading.
        error: no such partition
        grub rescue

    I then slaved the HDD to my Windows 7 desktop and formatted the entire drive, both sides of the partition, then reinstalled it in the Acer and tried to install Windows 7. Upon starting the Acer I got the same error message. I then tried to reinstall Ubuntu 9.10, as I have an Ubuntu-produced installation CD. Same result. The next day I received a new battery I had ordered for the Acer. I plugged it and the power supply in and hit the power button just to see if I could at least charge the battery, but to my surprise Ubuntu 9.10 began to install, so I let it, and it did. Now the hard drive shows 58 GB and 2.5 GB partitions, neither of which is formatted NTFS for/by Windows. I am guessing that the GRUB/MBR was repaired somehow by the Ubuntu reinstallation. My question, should you choose to accept it: how can I get to my goal of dual-booting Windows 7/Ubuntu 12.10? I am a beginner and don't know much about Linux or the terminology. Thank you for your thoughts and help.

    Read the article

  • Oracle Transportation Management (Lead) Functional Consultant in Germany

    - by user769227
    My name is Giovanni and I lead the practice of OTM (Oracle Transportation Management) consultants in Western Europe. I currently have a role open for an OTM Lead Consultant to join my international team in Germany. Oracle Transportation Management is the leading TMS application software in the market, as confirmed by Gartner's classification of it as a LEADER in its TMS Magic Quadrant, with the highest rating among vendors. The OTM Consulting practice is a team of OTM functional and technical specialists located across Europe whose broad objective is to assist companies in the implementation of their TMS solution based on OTM. These companies are leading Shippers in various industries and Logistics Service Providers. Key requirements for this role are: relevant experience with Supply Chain or Transportation Management in other consulting organizations or large enterprises, the drive to learn the leading TMS application software in today's market, and the interest to join a truly international team. We offer the opportunity to work for a leader of the IT industry and assist international clients in realizing their business transformation initiatives through innovation. If you have an entrepreneurial spirit and are looking for a work culture where innovation is the goal, hard work is expected, and creativity is rewarded, then please visit this link for more information.

    Read the article

  • Layers - Logical separation vs physical

    - by P.Brian.Mackey
    Some programmers recommend logical separation of layers over physical. For example, given a DL, this means we create a DL namespace, not a DL assembly. Benefits include:

      - faster compilation time
      - simpler deployment
      - faster startup time for your program
      - fewer assemblies to reference

    I'm on a small team of 5 devs. We have over 50 assemblies to maintain. IMO this ratio is far from ideal. I prefer an extreme programming approach: if 100 assemblies are easier to maintain than 10,000, then 1 assembly must be easier than 100. Given technical limits, we should strive for < 5 assemblies. New assemblies are created out of technical need, not layer requirements.

    Developers are worried for a few reasons:

      A. People like to work in their own environment so they don't step on each other's toes.
      B. Microsoft tends to create new assemblies. E.g. ASP.NET has its own DLL, so does WinForms, etc.
      C. Devs view this drive for a common assembly as a threat. Some team members have a tendency to change the common layer without regard for how it will impact dependencies.

    My personal view: I view A as silos, aka cowboy programming, and suggest we implement branching to create isolation. On C: first, that is a human problem, and we shouldn't create technical workarounds for human behavior. Second, my goal is not to put everything in common. Rather, I want partitions to be made in namespaces, not assemblies. Having a shared assembly doesn't make everything common.

    I want the community to chime in and tell me if I've gone off my rocker. Is a drive for a single assembly, or my viewpoint, illogical or otherwise a bad idea?

    Read the article
