Search Results

Search found 23311 results on 933 pages for 'multiple computers'.


  • How sophisticated should the DAL be?

    - by Andrew Florko
    Basically, a DAL (Data Access Layer) should provide simple CRUD (Create/Read/Update/Delete) methods, but I am always tempted to create more sophisticated methods in order to minimize database round trips from the Business Logic Layer. What do you think about the following extensions to CRUD (most of them are probably fine)? Read: GetById, GetByName, GetPaged, GetByFilter, etc. Create: GetOrCreate methods (the entity is returned from the DB, or created and returned if not found), and Create(lots-of-relations) instead of Create followed by multiple AssignTo calls. Update: Merge methods (a list of entities is updated, created and deleted in one call). Delete: Delete(bool children) for optional deletion of children, and Cleanup methods. Where do you usually implement entity cache capabilities, DAL or BLL? (My choice is BLL, but I have seen DAL implementations as well.) Where is the boundary at which you decide "this operation is too specific, so I should implement it in the Business Logic Layer as multiple DAL calls"? I have often found inefficient BLL operations implemented as dozens of database round trips because the developer was afraid to make the DAL slightly more sophisticated. Thank you in advance!
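
    As a rough illustration of the GetOrCreate idea discussed above, a minimal C# sketch might look like the following; the Customer entity, the in-memory store standing in for the database, and all member names are assumptions made purely for this example, not part of the question:

        using System.Collections.Generic;

        // Illustrative sketch only: the entity and repository are invented for the example.
        public class Customer
        {
            public int Id { get; set; }
            public string Name { get; set; }
        }

        public class CustomerRepository
        {
            // Stand-in for the real data store so the sketch stays self-contained.
            private readonly Dictionary<string, Customer> _store = new Dictionary<string, Customer>();
            private int _nextId = 1;

            // GetOrCreate folds a lookup and an insert into one DAL call,
            // saving the Business Logic Layer an extra round trip.
            public Customer GetOrCreate(string name)
            {
                Customer existing;
                if (_store.TryGetValue(name, out existing))
                    return existing;

                var created = new Customer { Id = _nextId++, Name = name };
                _store[name] = created;
                return created;
            }
        }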

    Read the article

  • MAAS. Adding nodes

    - by 2bus.erkanat
    Can anybody help me with a MAAS installation? I installed it on one of my computers, configured the DHCP server, and now 3 PCs are on their own private network. I tried to add a node via the PXE image, but after giving me the option to log in it shows the error "Can not apply stage final, no datasource found! Likely bad things to come!". Then I tried to add a node via the web interface, but after adding it the node's status stays at "commissioning". So what should I do now? Any help is appreciated. Thank you. P.S. English is not my native language; sorry for any errors.

    Read the article

  • Better way to set up samba bridge?

    - by Adam Butler
    I have an old Ubuntu laptop hooked up between my wireless network and a wired media player box. I had previously shared my wireless network connection so the media player had internet access (i.e. via NAT), but because it was on a different subnet it could not access the file shares on the wireless network. To get around this I mounted the drives from the wireless network on the laptop and re-shared them with Samba. This worked OK but had some drawbacks: it seemed slow, and if the network computers were turned off when the laptop rebooted I had to mount the shares manually. I've just re-installed with the latest Ubuntu and was wondering if there is a better way to do this. Is there some way to bridge so that the media player appears to be on the wireless network? Would this give better performance? Any other options? I'm also thinking there might be some Samba options that could buffer files?

    Read the article

  • Connection manager detects network but won't connect

    - by Carson Chittom
    I've just installed 12.04 on my home desktop. Because of where it's located, in order not to have my children tripping over Cat5 cable, I've got a cheap 802.11n USB device plugged in, which uses a Realtek 8192CU chip. The house wifi is protected with WPA2. Ubuntu's Network Manager detects the network, but connecting to it just times out and asks for the password again. I'm sure I'm using the correct password. No other computers on the house network are having any difficulty. This device previously worked correctly with both Windows 7 and OpenBSD on the same machine. The linux-firmware package is installed (from the disc) at version 1.79, and /lib/firmware/rtlwifi/rtl8192cufw.bin is present. dmesg | grep rtl only shows the "loading firmware" message. I've tried Google, but I apparently can't find the right set of words to plug in, because I haven't found anything related (lots of "doesn't work at all", but nothing matching my circumstances). Any help would be greatly appreciated.

    Read the article

  • Can only connect to file server on second attempt

    - by Ross Fleming
    I have a FreeNAS file server on my local network and I usually connect to it from Windows and Ubuntu computers. Ever since I upgraded from Ubuntu 12.04 to 12.10, Ubuntu will only connect on the second attempt. By which I mean: I browse to it via the file manager, and once I click on the link in "Bookmarks" it complains that it could not connect. If I then try again it connects successfully and keeps the connection up until the laptop is suspended or loses its connection to the LAN for whatever reason. This isn't much of a problem in itself, as I don't mind having to click twice, but my real problem is that my scheduled backup complains that it cannot connect to the storage device if the share has not already been accessed during the current session. Is there some way to either stop the issue altogether, or to force the default backup tool to immediately make a second attempt at connecting?

    Read the article

  • How to import a PDF in LibreOffice? Under Ubuntu, all pages are blank

    - by Daniele
    I have some .pdf files generated by a scanner that I want to import into LibreOffice to do some small editing. Each PDF has only one object per page: a page-sized image. If I open one in LibreOffice under Ubuntu 12.10, it imports "successfully" but all pages are blank. I have the libreoffice-pdfimport package installed. This is true with both LibreOffice 3.6 (part of Ubuntu 12.10) and with 4.0.2 from the LibreOffice PPA. The same .pdf files open perfectly fine in both LibreOffice for Windows and LibreOffice for Mac (yes, I have three computers with all three OSes), but on Ubuntu 12.10 all pages are blank, so I can only conclude this is an issue with the Ubuntu packaging, or that something really weird prevents it from working under Linux. How can I import these kinds of .pdf files into LibreOffice for editing?

    Read the article

  • How to deal with many to many relationships with NSFetchedResultsController?

    - by Phil Yates
    OK, so I have two entities in my data model (let's say entityA and entityB), and both entities have a to-many relationship to each other. I have set up an NSFetchedResultsController to fetch a bunch of entityA objects. Now I'm trying to have the section names for the table view be the title of entityB: sectionNameKeyPath:@"entityB.title". This causes a problem, whereby the section name returned from that relationship appears as ({title1}) or ({title1, title2, ..., titleN}), obviously depending on how many different entityB objects are involved. This doesn't look great in a table view and doesn't group the objects as I would like. What I would like is one section per entityB title, with entityA objects appearing under each section, under multiple sections if necessary. I'm at a loss as to how I am supposed to achieve this: whether I need to update the predicate to get the entity to appear multiple times, or whether I need to update the section and header functions to do some processing as the controller loops through the objects. Any help is appreciated :) Thanks

    Read the article

  • Why is chunk size often a power of two?

    - by danijar
    There are many Minecraft clones out there and I am working on my own implementation. A principle of terrain rendering is tiling the whole world into fixed-size chunks to reduce the effort of localized changes. In Minecraft the chunk size is 16 x 16 x 256, as far as I know, and in clones I have also always seen chunk sizes that are powers of two. Is there any reason for that, maybe performance or memory related? I know that powers of 2 play a special role in binary computers, but what does that have to do with the chunk size?
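
    One concrete convenience, sketched below in C# (the helper names are invented for illustration, and the 16-block width simply mirrors the size mentioned above): with a power-of-two chunk size, splitting a world coordinate into a chunk index and a local offset reduces to a shift and a mask instead of a division and a modulo, and the bit operations also behave correctly for negative world coordinates.

        // Sketch of the index math a power-of-two chunk size enables (illustrative only).
        public static class ChunkMath
        {
            private const int ChunkSize = 16;             // must be a power of two
            private const int ChunkShift = 4;             // log2(ChunkSize)
            private const int ChunkMask = ChunkSize - 1;  // low bits select the block within a chunk

            // Equivalent to floor(worldCoord / 16), including for negative coordinates.
            public static int ChunkCoord(int worldCoord)
            {
                return worldCoord >> ChunkShift;
            }

            // Equivalent to a mathematical mod 16, always in the range 0..15.
            public static int LocalCoord(int worldCoord)
            {
                return worldCoord & ChunkMask;
            }
        }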

    Read the article

  • What You Said: Giving an Old Laptop a New Life

    - by Jason Fitzpatrick
    Earlier this week we asked you to share your tips and tricks for breathing life into an old laptop; now we're back to share your junk-bin-sparing methods. Many of you worked to keep old laptops from getting scrapped by dusting them off and donating them. Mark writes: My acquaintances and friends give me their old computers when they buy a new one. So I disassemble them, clean them, install an OS, and get the internet working. I also upgrade memory, wireless, etc. from my parts bin. Then I give it to a poor person who needs a computer, usually a single working mom with kids. I do the same with old desktops as well. They really appreciate them, and it gives me the satisfaction of resurrecting an old computer. Wbrown does the same.

    Read the article

  • 'Invalid or corrupt kernel image' error while booting from USB stick

    - by steve
    I got the Ubuntu 11.10 32-bit edition and used UNetbootin to write the ISO to a USB stick. After that I tried to boot from the stick, having changed the BIOS boot settings, and when the first screen shows up with the installation choices, whatever choice I select I get the message "invalid or corrupt kernel image". I use a netbook (an ASUS 1000H) with Windows 7 on it. I tried different USB sticks, but it's the same every time. I also tried Universal USB Installer instead of UNetbootin, with the same result. The USB stick doesn't work in one more PC I tried it in either, and a second USB stick fails the same way on both computers. I downloaded the ISO image once again and followed the same procedure, but it's still the same. Any idea what is happening?

    Read the article

  • Is this a correct Interlocked synchronization design?

    - by Dan Bryant
    I have a system that takes Samples. I have multiple client threads in the application that are interested in these Samples, but the actual process of taking a Sample can only occur in one context. It's fast enough that it's okay for it to block the calling process until Sampling is done, but slow enough that I don't want multiple threads piling up requests. I came up with this design (stripped down to minimal details):

        public class Sample
        {
            private static Sample _lastSample;
            private static int _isSampling;

            public static Sample TakeSample(AutomationManager automation)
            {
                //Only start sampling if not already sampling in some other context
                if (Interlocked.CompareExchange(ref _isSampling, 0, 1) == 0)
                {
                    try
                    {
                        Sample sample = new Sample();
                        sample.PerformSampling(automation);
                        _lastSample = sample;
                    }
                    finally
                    {
                        //We're done sampling
                        _isSampling = 0;
                    }
                }
                return _lastSample;
            }

            private void PerformSampling(AutomationManager automation)
            {
                //Lots of stuff going on that shouldn't be run in more than one context at the same time
            }
        }

    Is this safe for use in the scenario I described?
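
    For comparison, the flag-guard pattern is conventionally written so that CompareExchange stores 1 only when the flag is currently 0; here is a minimal self-contained C# sketch of that convention (the class name and the Action delegate standing in for the real sampling work are assumptions for the example, not the question's code):

        using System;
        using System.Threading;

        public static class SingleFlightSampler
        {
            private static int _isSampling; // 0 = idle, 1 = sampling

            // CompareExchange(ref location, value, comparand) stores `value` only if
            // `location` equals `comparand`, and returns the old value, so only the
            // thread that flips the flag from 0 to 1 enters the branch.
            public static void TakeSample(Action performSampling)
            {
                if (Interlocked.CompareExchange(ref _isSampling, 1, 0) == 0)
                {
                    try
                    {
                        performSampling();
                    }
                    finally
                    {
                        Interlocked.Exchange(ref _isSampling, 0); // publish "idle" to other threads
                    }
                }
            }
        }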

    Read the article

  • Network sharing for android

    - by Shagun
    I am on Ubuntu 12.04 and I want to create a wifi hotspot to use with my Android device. I know there are many tutorials available everywhere and that I have all the options in the network tab to use, but I couldn't get it to work. Android does not work with an ad-hoc connection, and whatever wifi network I created, my Android device could not connect to it (I could connect a Bada phone to it, and other computers can also connect). I know the workaround to get Android onto an ad-hoc connection, but can't I have something as simple as Connectify for Windows? P.S.: I am not looking for workarounds involving Android.

    Read the article

  • Where's my memory going?

    - by Stu2000
    My machine keeps 'freezing' before eventually logging out with all the programs exiting. This is rather annoying, and I think it's because I keep running out of memory. I am not running any custom software, just NetBeans, Chrome, etc. (stuff I usually run on other Ubuntu computers without issue). For some reason my memory usage is through the roof, as seen here, but I can't quite figure out why. Here is a screenshot which may be useful, with htop and GNOME System Monitor open as user and as root. I notice that console-kit-daemon is taking up about a gig of 'virtual memory'. Is that normal? Any tips/advice will be helpful. In the meantime I have ordered 2 x 4 GB RAM sticks to try and just throw hardware at the issue.

    Read the article

  • Linking Excel and Access

    - by Mel
    I run a sports program where I keep a master roll of who is in which class in Excel. I want to link this to a database in Access that stores the other information about each athlete, e.g. address, parents' names, school, medical details. I want to be able to add names to a class in the Excel spreadsheet and have this automatically generate a record for that person in Access. There also needs to be some failsafe for athletes that are in multiple classes. I was also producing class rolls as pivot tables out of the Access database, so I need to code for classes and also allow for athletes in multiple classes/disciplines.

    Read the article

  • Safe HttpContext.Current.Cache Usage

    - by Burak SARICA
    Hello there, I use the Cache in a web service method like this:

        var pblDataList = (List<blabla>)HttpContext.Current.Cache.Get("pblDataList");
        if (pblDataList == null)
        {
            var PBLData = dc.ExecuteQuery<blabla>(@"SELECT blabla");
            pblDataList = PBLData.ToList();
            HttpContext.Current.Cache.Add("pblDataList", pblDataList, null,
                DateTime.Now.Add(new TimeSpan(0, 0, 15)),
                Cache.NoSlidingExpiration, CacheItemPriority.Normal, null);
        }

    I wonder, is it thread safe? I mean, the method is called by multiple requesters, and more than one requester may hit the second line at the same time while the cache is empty, so all of these requesters will retrieve the data and add it to the cache. The query takes 5-8 seconds. Would a surrounding lock statement around this code prevent that? (I know multiple queries will not cause an error, but I want to be sure only one query runs.)
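
    A common way to ensure only one requester runs the query is to re-check the cache inside a lock. The following is a minimal sketch of that pattern; the PblDataCache class, the List<string> payload, and the loadFromDatabase delegate are placeholders invented for the example rather than the question's real types:

        using System;
        using System.Collections.Generic;
        using System.Web;
        using System.Web.Caching;

        public static class PblDataCache
        {
            private static readonly object _cacheLock = new object();

            public static List<string> GetPblData(Func<List<string>> loadFromDatabase)
            {
                // First check without the lock: the common, already-cached case stays cheap.
                var data = (List<string>)HttpContext.Current.Cache.Get("pblDataList");
                if (data != null)
                    return data;

                lock (_cacheLock)
                {
                    // Second check inside the lock: another thread may have filled the
                    // cache while we waited, so at most one thread runs the slow query.
                    data = (List<string>)HttpContext.Current.Cache.Get("pblDataList");
                    if (data == null)
                    {
                        data = loadFromDatabase();
                        HttpContext.Current.Cache.Add(
                            "pblDataList", data, null,
                            DateTime.Now.AddSeconds(15),
                            Cache.NoSlidingExpiration,
                            CacheItemPriority.Normal, null);
                    }
                }
                return data;
            }
        }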

    Read the article

  • Using Entity Framework, how do I specify a sort on a navigation property?

    - by Jared
    I have two tables, [Category] and [Item], connected by a join table, [CategoryAndItem], which has a composite primary key of [CategoryKey] and [ItemKey]. The foreign keys exist appropriately, and Entity Framework has no problem pulling this in and creating the correct navigation properties that connect the entity objects. Basically, each category can have multiple items, and an item can be in multiple categories. The problem is that the order of items is specified per category, so that a particular item might be third in one category but fifth in another. In the past, I have added a [Sequence] field to the join table and modified the stored procedure to handle it. But since Entity Framework is replacing my stored procedures, I need to figure out how to make it handle the sequence. Any suggestions?
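
    One way this is commonly sketched (the entity classes below are assumptions that mirror the question's table names, not the actual model): once the join table carries a [Sequence] payload column it can no longer be mapped as a hidden many-to-many, so it becomes an entity of its own, and the per-category order is applied when traversing it, e.g. with LINQ:

        using System.Collections.Generic;
        using System.Linq;

        // Illustrative POCOs only; property names are guesses based on the question.
        public class Category
        {
            public int CategoryKey { get; set; }
            public ICollection<CategoryAndItem> CategoryAndItems { get; set; }
        }

        public class Item
        {
            public int ItemKey { get; set; }
            public string Name { get; set; }
            public ICollection<CategoryAndItem> CategoryAndItems { get; set; }
        }

        public class CategoryAndItem
        {
            public int CategoryKey { get; set; }
            public int ItemKey { get; set; }
            public int Sequence { get; set; }          // per-category ordering lives here

            public Category Category { get; set; }
            public Item Item { get; set; }
        }

        public static class CategoryQueries
        {
            // The items of one category, in that category's own order.
            public static IEnumerable<Item> ItemsInOrder(Category category)
            {
                return category.CategoryAndItems
                               .OrderBy(link => link.Sequence)
                               .Select(link => link.Item);
            }
        }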

    Read the article

  • Will Apple abandon OpenCL?

    - by John
    I am developing OpenCL applications, among others for MacOS. The new MacBook Pro 13 inch comes with an Intel HD Graphics 3000 card, so it seems reasonable to assume all of their mainstream computers, like the MacBook and Mac mini, will also come with this Intel graphics card soon. OpenCL is not available for Intel graphics cards. With Intel having a terrible reputation for graphics drivers, and Apple knowing this, I wonder whether Apple is already abandoning OpenCL again, especially considering that OpenCL should run anywhere, not only on high-end systems. Developing applications only for high-end Macs with dedicated graphics hardware, or for previous-generation hardware with the GeForce 320M, would not be a feasible option for me. Does anybody have any thoughts on this?

    Read the article

  • My rhythm game runs choppy even with high frame rate

    - by felipedrl
    I'm coding a rhythm game and the game runs smoothly with uncapped FPS. But when I try to cap it around 60, the game updates in little chunks, like hiccups, as if it were skipping frames or running at a very low frame rate. The reason I need to cap the frame rate is that on some computers I tested, the FPS varies a lot (from ~80 to ~250), and those drops are noticeable and degrade response time. Since this is a rhythm game, that is very important. This issue is driving me crazy. I've spent a few weeks on it already and still can't figure out the problem. I hope someone more experienced than me can shed some light on it. I'll try to put here all the hints I've tried along with pseudo code for the two game loops I tried, so I apologize if this post gets too lengthy. 1st game loop:

        const uint UPDATE_SKIP = 1000 / 60;
        uint nextGameTick = SDL_GetTicks();
        while (isNotDone)
        {
            // only false when a QUIT event is generated!
            if (processEvents())
            {
                if (SDL_GetTicks() > nextGameTick)
                {
                    update(UPDATE_SKIP);
                    render();
                    nextGameTick += UPDATE_SKIP;
                }
            }
        }

    2nd game loop:

        const uint UPDATE_SKIP = 1000 / 60;
        while (isNotDone)
        {
            LARGE_INTEGER startTime;
            QueryPerformanceCounter(&startTime);
            // processEvents will return false in case a QUIT event is processed
            if (processEvents())
            {
                update(frameTime);
                render();
            }
            LARGE_INTEGER endTime;
            do
            {
                QueryPerformanceCounter(&endTime);
                frameTime = static_cast<uint>((endTime.QuadPart - startTime.QuadPart) * 1000.0 / frequency.QuadPart);
            } while (frameTime < UPDATE_SKIP);
        }

    [1] At first I thought it was a timer resolution problem. I was using SDL_GetTicks, but even when I switched to QueryPerformanceCounter, supposedly finer-grained, I saw no difference.
    [2] Then I thought it could be due to a rounding error in my position computation, and since game updates are smaller at high FPS that would be less noticeable. There is indeed a small error, but from my tests I realized it is not enough to produce the position jumps I'm getting. Another intriguing factor is that if I enable vsync I get smooth updates at 60 FPS regardless of the frame-cap code. So why not rely on vsync? Because some computers' graphics card settings can force it off.
    [3] I started printing the maximum and minimum frame times measured over a one-second span, hoping that every few frames one would take a long time, yet not enough to drop my FPS computation. It turns out that, with the frame-cap code, I always get frame times in the range of [16, 18] ms, and still the game "does not move like Jagger".
    [4] My process priority is set to HIGH (Windows doesn't allow me to set REALTIME for some reason). As far as I know there is only one other thread running along with the game: a sound callback which I really don't have access to (I'm using the Audiere library). I then disabled Audiere by removing it from the project and still got the issue. Maybe there are some other threads running and one of them takes too long to come back right in between my frame time measurements, I don't know. Is there a way to know which threads are attached to my process?
    [5] There is some dynamic data being created during the game run. It is a little bit hard to remove it to test, though. Maybe I'll have to try harder on this one.
    Well, as I said, I really don't know what to try next. Anything, I mean anything, would be of great help. What bugs me most is why I get a smooth update at 60 FPS with vsync enabled, but not at 60 FPS without vsync. Is there a way to implement a software vsync, i.e. to query display sync info? Thanks in advance.
    I appreciate those of you who got this far, and again I apologize for the long post. Best regards from a fellow coder.

    Read the article

  • Dealing with SQLite join results in a cursor

    - by Bill
    I have a one-to-many relationship in my local SQLite db. Pretty basic stuff. When I do my left outer join, the resulting cursor has multiple rows that look like this:

        A1.id | A1.column1 | A1.column2 | B1.a_id_fk | B1.column1 | B1.column2
        A1.id | A1.column1 | A1.column2 | B2.a_id_fk | B2.column1 | B2.column2

    and so on. Is there a standard practice or method of dealing with results like this? Clearly there is only one A1 row, but it has many related B rows. I am coming close to using multiple queries instead of the "relational db way". Hopefully I am just not aware of a better way to do things. I intend to expose this query via a content provider, and I would hate for all of the consumers to have to write the same aggregation logic.

    Read the article

  • Almost Realtime Data and Web application

    - by Chris G.
    I have a computer that is recording 100 different data points into an OPC server. I've written a simple OPC client that can read all of this data. I have a front-end website on a different network that I would like to have consume this data. I could easily set the OPC client to send the data to a SQL server and have the website read from it, but that would be a lot of writes: if I wanted the data updated every 10 seconds, I'd be writing to the database every 10 seconds. (I could probably serialize the 100 points to get one write per 10 seconds, but that would also limit my ability to search the data later.) This solution wouldn't scale very well; if I had 100 of these computers, the situation would quickly grow out of hand. Obviously I am well out of my league here and I have no experience working with this much data. What are my options and what should I research?

    Read the article

  • Send a "304 Not Modified" for images stored in the datastore

    - by Emilien
    I store user-uploaded images in the Google App Engine datastore as db.Blob, as proposed in the docs. I then serve those images at /images/<id>.jpg. The server always sends a 200 OK response, which means the browser has to download the same image multiple times (== slower) and the server has to send the same image multiple times (== more expensive). As most of those images will likely never change, I'd like to be able to send a 304 Not Modified response. I am thinking about calculating some kind of hash of the picture when the user uploads it, and then using that to know whether the user already has the image (maybe sending the hash as an ETag?). I have found this answer and this answer that explain the logic pretty well, but I have 2 questions: Is it possible to send an ETag in Google App Engine? Has anyone implemented such logic, and/or is there any code snippet available?

    Read the article

  • How do I get my Broadcom BCM4313 working correctly?

    - by Ataraxio Panzetta
    I've installed Ubuntu on an Asus 1015 netbook. Everything worked out of the box except for the wireless adapter, which I had to install with the Additional Drivers application. It apparently installed fine and connects to our wireless network, but it only works at a "funny" speed that ranges from 367 bytes to a peak of 3 KB in its best moments. I know for sure the problem is neither the network nor the hardware: network speeds are normal under Windows on this laptop and on other computers running Ubuntu as well. lspci says the card is a BCM4313 model, but Additional Drivers says the package contains the "Broadcom 802.11 Linux STA wireless driver for use with Broadcom's BCM4311-, BCM4312-, BCM4321-, and BCM4322-based hardware". It seems like it installed the wrong driver... Is there anything I can do? I'm not concerned about compiling the driver or stuff like that, but I'm not sure where to start... any help or guidance will be very, very appreciated.

    Read the article

  • RESTful way of deleting a bunch of items

    - by keisimone
    The Wikipedia article on REST indicates that a DELETE on http://example.com/resources deletes the entire collection, while a DELETE on http://example.com/resources/7HOU57Y deletes that element. I am building a WEBSITE, note, NOT a web service. I have a list with one checkbox for each item. Once the user selects multiple items for deletion, they can press a button called DELETE SELECTION. If the user presses the button, a JS dialog box pops up asking them to confirm the deletion; if they confirm, all the selected items are deleted. So how should I cater for deleting multiple items in a RESTful way? Note: currently, for a DELETE from a web page, I use a FORM tag with POST as the method but include a _method field with the value DELETE, since this is what others on SO indicated for doing a RESTful delete from a web page.

    Read the article

  • Install package with dependencies offline

    - by ArtemStorozhuk
    Right now I have two computers: the first has a connection to the internet and has package A installed; the second doesn't have a connection to the web, and it's on this PC that I need to install package A. I decided to download all the needed packages using the first PC and transfer them to the second PC via USB. I have searched for how to get all the packages needed for a deb installation, and here's what I've found. But when I run apt-get --print-uris --yes install A | grep ^\' | cut -d\' -f2 > downloads.list on the first PC, I get an empty file, because the package is already installed there (and I don't want to uninstall it). Also, package A is very complicated: it depends on package B, which depends on package C, and package C is not installed on the second PC. So how can I download all the needed packages? Or is there any other way of installing it? Thanks for the help.

    Read the article
