Daily Archives

Articles indexed Thursday May 29 2014

Page 5 of 15

  • IX eCommerce Forum: Oracle and Euronics present their success story

    - by Claudia Caramelli-Oracle
    Promoted by Netcomm, the event has reached its ninth edition. Its central theme explores the dynamics of the entire e-commerce supply chain, offering useful insights through the involvement of distinguished guests and speakers. The e-Commerce Forum is the ideal place to discover the opportunities of the Italian market. Oracle, together with Reply, organized a workshop lunch aimed at everyone interested in hearing success stories about how Oracle's eCommerce platform has been implemented. The testimonial on this occasion was Euronics. Nearly 40 people spent their lunch break with us! The topic, after all, is current and continuously evolving and expanding: interest is high, and Oracle offers state-of-the-art tools for building your own success story, pushing businesses ever further ahead in e-commerce. For more information, write to Silvia Valgoi.

    Read the article

  • PARTNER WEBCAST (June 4): Enhance Customer experience with Nimble Storage SmartStack for Oracle with Cisco

    - by Zeynep Koch
    Live Webcast: Enhance Customer Experience with Nimble Storage SmartStack for Oracle with Cisco. A webcast for resellers who sell Oracle workloads to customers. Wednesday, June 4, 2014, 8:00 AM PDT / 11:00 AM EDT. Register today. Nimble Storage SmartStack™ for Oracle provides a pre-validated reference architecture that speeds deployments and minimizes risk. IT and Oracle administrators and architects realize the importance of the underlying operating system, virtualization software, and storage in maintaining service levels and staying on budget. In this webinar, you will learn how Nimble Storage SmartStack for Oracle provides a converged infrastructure for Oracle database online transaction processing (OLTP) and online analytical processing (OLAP) environments with Oracle Linux and Oracle VM. SmartStack delivers the performance and reliability needed for deploying Oracle on a single symmetric multiprocessing (SMP) server or running Oracle Real Application Clusters (RAC) on multiple nodes. Nimble Storage SmartStack for Oracle with Cisco can help you provide:
      - Improved Oracle performance
      - Stress-free data protection and DR of your Oracle database
      - Higher availability and uptime
      - Accelerated Oracle development and improved testing
      - All for dramatically less than what you’re paying now
    Presenters:
      - Doan Nguyen, Senior Principal Product Marketing Director, Oracle
      - Vanessa Scott, Business Development Manager, Cisco
      - Ibrahim “Ibby” Rahmani, Product and Solutions Marketing, Nimble Storage
    Join this event to learn from our Nimble Storage and Oracle experts how to optimize your customers' Oracle environments. Register today to learn more!

    Read the article

  • Success Quote: A Hybrid Approach for Success

    - by Lauren Clark
    We recently received this quote from a project that successfully used OUM: “On our project, we applied a combination of the Oracle Unified Method (OUM) and the client's methodology. The project was organized by OUM's phases and a subset of OUM's processes, tasks, and templates. Using a hybrid of the two methods resulted in an implementation approach that was optimized for the client-specific requirements for this project." This hybrid approach is an excellent example of using OUM in the flexible and scalable manner in which it was intended. The project team was able to scale OUM to be fit for purpose for their given situation. It's great to see how merging what was needed out of OUM with the client’s methodology resulted in an implementation approach that aligned more closely with the business needs. Successfully scaling OUM depends on the needs of the particular project and/or engagement. The key is to use no more than is necessary to satisfy the requirements of the implementation and appropriately address risks. For more information, check out the "Tailoring OUM for Your Project" page, which can be accessed by first clicking on the "OUM should be scaled to fit your implementation" link on the OUM homepage and then drilling into the link on the subsequent page. Have you used OUM in conjunction with a partner or customer methodology? Please share your experiences with us.

    Read the article

  • Register Now! Oracle 'In Touch' PartnerCast: Be prepared for a year of growth

    - by Julien Haye
    Dear Oracle partners, We would like to invite you to join David Callaghan, Senior Vice President Oracle EMEA Alliances and Channels, and his studio guests for the next broadcast of the ‘In Touch’ PartnerCast on Tuesday 1st July 2014 from 10:30am UK / 11:30am CET. In this cast, David’s studio guests and his regional reporters will be looking at your priorities as EMEA partners and how best to grow with Oracle. We also look forward to the broadcast covering the following hot topics:
      - Highlights of FY14
      - Strategic themes for FY15
      - SaaS - HCM, CRM, ERP
      - Oracle on Oracle
    Exclusive for ‘In Touch’: David Callaghan questions Rich Geraffo, Senior Vice President, Global Alliances & Channels, on how the FY15 global partner kickoff relates to EMEA. Plus, David gives you the chance to hear from some of the newly appointed Oracle Worldwide A&C leadership team as he discusses with Bruce Chumley, VP Oracle Channel Distribution Sales, and Troy Richardson, VP Oracle Strategic Alliances, their core focus, their strategy for growth, and what they intend to bring to the table in their new roles. You can now register for the cast here. With lots of studio guests joining David, why not get in touch on Twitter using the hashtag #OracleInTouch or by emailing [email protected] to get your questions featured in the cast! To find out more information and to watch previous episodes on-demand, please visit our webpage here. Best regards, Oracle EMEA Alliances & Channels

    Read the article

  • Extract and convert all Excel worksheets into CSV files using PowerShell

    Can PowerShell provide an easy way to export Excel as a CSV? Yes. Tim Smith demonstrates that whether you have multiple Excel files or just multiple worksheets in Excel, PowerShell simplifies the process.
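    The article itself demonstrates PowerShell. As an illustration of the same idea only, here is a minimal sketch in Python using the openpyxl package; the file name is hypothetical and this is not Tim Smith's code.

      import csv
      from pathlib import Path

      from openpyxl import load_workbook  # pip install openpyxl

      def workbook_to_csvs(xlsx_path):
          """Write every worksheet in the workbook to its own CSV file."""
          wb = load_workbook(xlsx_path, read_only=True, data_only=True)
          stem = Path(xlsx_path).stem
          for ws in wb.worksheets:
              out = Path(xlsx_path).with_name(f"{stem}_{ws.title}.csv")
              with open(out, "w", newline="", encoding="utf-8") as f:
                  writer = csv.writer(f)
                  for row in ws.iter_rows(values_only=True):
                      writer.writerow(row)

      workbook_to_csvs("report.xlsx")  # hypothetical input file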

    Read the article

  • The updated Survey pattern for Power Pivot and Tabular #powerpivot #tabular #ssas #dax

    - by Marco Russo (SQLBI)
    One of the first models I created for the many-to-many revolution white paper was the Survey one. At the time, it was in Analysis Services Multidimensional, and then we implemented it in Analysis Services Tabular and in Power Pivot, using the DAX language. I recently reviewed the data model and published it in the Survey article on the DAX Patterns site. The Survey pattern is the foundation for others, such as Basket Analysis, and it is widely used in many different business scenarios. I was particularly happy to learn it has been used to perform data analysis for cancer research! In this article I did some maintenance on the DAX formulas, checking that proper error handling is part of the formulas, and highlighting some differences in slicer behavior between Excel 2010 and Excel 2013, which could be particularly important for the Survey scenario. As usual, we provide sample workbooks for both Excel 2010 and Excel 2013, and we use DAX Formatter to make the DAX code easier to read. Any feedback will be appreciated!

    Read the article

  • Detecting Hyper-Threading state

    - by jchang
    To interpret performance counters and execution statistics correctly, it is necessary to know the state of Hyper-Threading. In principle, at low overall CPU utilization, for non-parallel execution plans, it should not matter whether HT is enabled or not. Of course, DBA life is never that simple. The state of HT does matter at high overall utilization and in parallel execution plans, depending on the DOP. SQL Server does seem to try to allocate threads on distinct physical cores at intermediate DOP (DOP less...(read more)
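    The post concerns inferring HT state from SQL Server's counters, but as a quick OS-level sanity check you can compare logical and physical core counts. A minimal sketch in Python using the psutil package (an assumption, not the author's method):

      import psutil  # pip install psutil

      logical = psutil.cpu_count(logical=True)
      physical = psutil.cpu_count(logical=False)

      # On typical x86 hardware, more logical processors than physical
      # cores implies SMT/Hyper-Threading is enabled.
      print(f"{physical} physical cores, {logical} logical processors")
      print("Hyper-Threading appears", "enabled" if logical > physical else "disabled")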

    Read the article

  • Columnstore Case Study #2: Columnstore faster than SSAS Cube at DevCon Security

    - by aspiringgeek
    Preamble: This is the second in a series of posts documenting big wins encountered using columnstore indexes in SQL Server 2012 & 2014. Many of these can be found in my big deck along with details such as internals, best practices, caveats, etc. The purpose of sharing the case studies in this context is to provide an easy-to-consume quick-reference alternative. See also Columnstore Case Study #1: MSIT SONAR Aggregations.
    Why Columnstore? As stated previously, if we're looking for a subset of columns from one or a few rows, given the right indexes, SQL Server can do a superlative job of providing an answer. If we're asking a question which by design needs to hit lots of rows (DW, reporting, aggregations, grouping, scans, etc.), SQL Server has never had a good mechanism until columnstore. Columnstore indexes were introduced in SQL Server 2012; however, they're still largely unknown. Some adoption blockers existed, yet columnstore was nonetheless a game changer for many apps. In SQL Server 2014, potential blockers have been largely removed & columnstore indexes are going to profoundly change the way we interact with our data. The purpose of this series is to share the performance benefits of columnstore & to document why columnstore is a compelling reason to upgrade to SQL Server 2014.
    The Customer: DevCon Security provides home & business security services & has been in business for 135 years. I met DevCon personnel while speaking to the Utah County SQL User Group on 20 February 2012. (Thanks to TJ Belt (b|@tjaybelt) & Ben Miller (b|@DBADuck) for the invitation, which serendipitously coincided with the height of ski season.)
    The App: DevCon Security Reporting: Optimized & Ad Hoc Queries. DevCon users interrogate a SQL Server 2012 Analysis Services cube via SSRS. In addition, the SQL Server 2012 relational back end is the target of ad hoc queries; this DW back end is refreshed nightly during a brief maintenance window via conventional table partition switching.
    SSRS, SSAS, & MDX: Conventional relational structures were unable to provide adequate performance for user interaction with the SSRS reports. An SSAS solution was implemented, requiring personnel to ramp up technically, including learning enough MDX to satisfy requirements.
    Ad Hoc Queries: Even though the fact table is relatively small (only 22 million rows & 33GB), the table was a typical DW table in terms of its width: 137 columns, any of which could be the target of ad hoc interrogation. As is common in DW reporting scenarios such as this, it is often nearly impossible to optimize for such queries using conventional indexing. DevCon DBAs & developers attended PASS 2012 & were introduced to the marvels of columnstore in a session presented by Klaus Aschenbrenner (b|@Aschenbrenner).
    The Details: Classic vs. columnstore before-&-after metrics are impressive.

      Scenario        Conventional Structures          Columnstore    Δ
      SSRS via SSAS   10-12 seconds                    1 second       >10x
      Ad Hoc          5-7 minutes (300-420 seconds)    1-2 seconds    >100x

    Two charts in the original post characterize this data graphically, one on a linear scale and one on a logarithmic scale. As is so often the case when charting such significant deltas, the linear scale doesn't expose the dramatically improved columnstore values, and even on the logarithmic scale the 1-2 second values are barely visible.
    The Wins
    Performance: Even prior to the columnstore implementation, at 10-12 seconds, canned report performance against the SSAS cube was tolerable. Yet the 1-second performance afterward is clearly better. As significant as that is, imagine the user experience for ad hoc interrogation. The difference between several minutes and one or two seconds is a game changer, literally changing the way users interact with their data: no mental context switching, no wondering when the results will appear, no preoccupation with the spinning mind-numbing hurry-up-&-wait indicators. As we've commonly found elsewhere, columnstore indexes here provided performance improvements of one, two, or more orders of magnitude.
    Simplified Infrastructure: Because in this case a nonclustered columnstore index on a conventional DW table was faster than an Analysis Services cube, the entire SSAS infrastructure was rendered superfluous & was retired.
    PASS Rocks: Once again, the value of attending PASS is proven out. The trip to Charlotte combined with eager & enquiring minds led directly to this success story. Find out more about the next PASS Summit here, hosted this year in Seattle on November 4-7, 2014.
    DevCon BI Team Lead Nathan Allan provided this unsolicited feedback: “What we found was pretty awesome. It has been a game changer for us in terms of the flexibility we can offer people that would like to get to the data in different ways.”
    Summary: For DW, reports, & other BI workloads, columnstore often provides significant performance enhancements relative to conventional indexing. I have documented here, in the second in a series of reports on columnstore implementations, results from DevCon Security, a live customer production app for which performance increased by factors of 10x to 100x for all report queries, including canned queries, and for which time-to-results for ad hoc queries dropped from 5-7 minutes to 1-2 seconds. As a result of columnstore performance, the customer retired their SSAS infrastructure. I invite you to consider leveraging columnstore in your own environment. Let me know if you have any questions.

    Read the article

  • TechEd 2014 Day 3

    - by John Paul Cook
    There is some confusion about durability of data stored in SQL Server in-memory tables, so some review of the concepts is appropriate. The in-memory option is enabled at the database level. Enabling it at the database level only gives you the option to specify the in-memory feature on a table by table basis. No existing tables or new tables will by default become in-memory tables when you enable the feature at the database level. If you choose to make a table an in-memory table, by default it is...(read more)

    Read the article

  • Single IBAction for multiple UIButtons versus single IBAction for single UIButton

    - by Miraaj
    While using storyboards there are two different approaches which my teammates follow:

    Approach 1: Bind a unique action to each button, i.e.:
      Done button - bound to - doneButtonAction
      Cancel button - bound to - cancelButtonAction

    Approach 2: Bind a single action to multiple buttons, i.e.:
      Done button - bound to - commonButtonAction
      Cancel button - bound to - commonButtonAction

    Then in commonButtonAction they prefer to use a switch statement like this:

      - (IBAction)commonButtonAction:(id)sender {
          UIButton *button = (UIButton *)sender;
          switch (button.tag) {
              case 201: // done button
                  [self doneButtonAction:sender];
                  break;
              case 202: // cancel button
                  [self cancelButtonAction:sender];
                  break;
              default:
                  break;
          }
      }

      - (void)cancelButtonAction:(id)sender {
          // no interesting stuff, simple dismiss of view :-(
      }

      - (void)doneButtonAction:(id)sender {
          // some interesting stuff ;-)
      }

    The reasoning they give for approach 2 is that during a code walkthrough anyone can easily identify where to find the code related to button actions in each view controller. Others reject this idea because they say the extra switch statement is unnecessary and is not common practice. What are your views?

    Read the article

  • Android -> Ruby Server Interface -> Mongodb

    - by MRabRabbit
    I've been racking my brain about this for a few days. I'll run my scenario by you and hopefully you can help me. In my head this is how it goes: I have an Android app. I want my Android app to make (function) calls to a MongoDB database via a Ruby interface on the server. e.g.:
      1. The Android app sends an HTTP GET with the function name, let's say getFriends, for this user.
      2. The Ruby interface receives this request from the app, grabs a thread from a thread pool and calls the appropriate function, implemented in Ruby, against the MongoDB.
      3. The Ruby interface gets the results from MongoDB and sends an HTTP POST to the Android app.
    So that's how I think it works. I know about the Ruby driver for MongoDB and interacting with MongoDB from Ruby, but how do I make a Ruby back end listen for incoming messages, and should these messages be done through sockets or an HTTP interface a la Net::HTTP in Ruby?
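    For the listening part, a small web framework on top of the database driver is the usual shape; in Ruby that might be Sinatra plus the mongo gem. As a rough sketch of the same pattern in Python (Flask and pymongo are stand-ins, not the poster's stack, and the database and route names are hypothetical):

      from flask import Flask, jsonify   # pip install flask
      from pymongo import MongoClient    # pip install pymongo

      app = Flask(__name__)
      db = MongoClient("mongodb://localhost:27017")["appdb"]

      @app.route("/users/<user_id>/friends", methods=["GET"])
      def get_friends(user_id):
          # The framework's built-in server does the socket listening and
          # request dispatch; no hand-rolled thread pool is needed.
          doc = db.users.find_one({"_id": user_id}, {"friends": 1})
          return jsonify(doc["friends"] if doc else [])

      if __name__ == "__main__":
          app.run(port=8080)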

    Read the article

  • Preparing for interview questions involving scale

    - by Chaitanya
    I have over 6 years of software development experience. I have worked on multiple platforms, including mobile. However, I have not had a chance to work on scale-related issues. As a consequence, whenever someone asks an interview question involving a million inputs, I find myself out of my depth. How do I prepare for such questions? Any books/resources to refer to? There are books for Java, data structures and concurrency, but I don't know of any definitive ones for learning about scaling.

    Read the article

  • What algorithms would suit image colour summarization? [on hold]

    - by codecowboy
    I would like to analyse a set of hundreds of thousands of product images (clothing, electronic goods etc.) and retrieve the dominant colours in each. I'm only interested in the top 3 or 4 colours. The aim is to achieve a degree of certainty that image x is mostly red or image y is mostly orange and blue. The images are likely to be colour JPEGs of reasonable quality and approximately 100kb in size. I would like to use C# and the solution should run on a Linux server, preferably using open source libraries. What image processing algorithms or techniques might help me achieve this?
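    A common technique is to downsample the image, quantize each channel so near-identical shades pool together, and count the resulting colours (k-means clustering over the pixels is the heavier-weight variant). The asker wants C#, but as an illustration of the approach only, here is a minimal sketch in Python with Pillow; the file name and grid size are arbitrary choices:

      from collections import Counter

      from PIL import Image  # pip install pillow

      def dominant_colors(path, n=4, sample=(64, 64), step=32):
          """Return the n most common coarsely-quantized RGB colours."""
          img = Image.open(path).convert("RGB").resize(sample)
          # Snap each channel onto an 8-level grid (256/32) so
          # near-identical shades are counted as the same colour.
          pixels = ((r // step * step, g // step * step, b // step * step)
                    for r, g, b in img.getdata())
          return Counter(pixels).most_common(n)

      for colour, count in dominant_colors("product.jpg"):  # hypothetical file
          print(colour, count)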

    Read the article

  • Practical mysql schema advice for eCommerce store - Products & Attributes

    - by Gravy
    I am currently planning my first eCommerce application (MySQL & Laravel Framework). I have various products, which all have different attributes. Describing products very simply: some will have a manufacturer, some will not; some will have a diameter, others will have a width, height and depth, and others will have a volume.
    Option 1: Create a master products table, and separate tables for specific product types (polymorphic relations). That way, I will not have any unnecessary null fields in the products table.
    Option 2: Create a products table with all possible fields, despite the fact that there will be a lot of null values.
    Option 3: Normalise so that each attribute type has its own table.
    Option 4: Create an attributes table, as well as an attribute_values table with the value stored as varchar regardless of the actual data type. The products table would have a many:many relationship with the attributes table.
    Option 5: Put attributes common to all or most products in the products table, and attach attributes specific to a particular category of product to the categories table.
    My thoughts are that I would like to be able to allow easy product filtering and sorting by these attributes. I would also want the frontend to be fast, with less concern over the performance of inserting and updating product records. I'm a bit overwhelmed by the vast implementation options, and cannot find a suitable answer in terms of the best method of approach. Could somebody point me in the right direction? In an ideal world, I would like to offer the following kind of functionality - http://www.glassesdirect.co.uk/products/ - to my eCommerce store. As can be seen in the sidebar, you can select an attribute of the glasses to filter them, e.g. male / female or plastic / metal / titanium etc. Alternatively, should I just dump the MySQL relational database idea and learn MongoDB?
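    To make Option 4 (the entity-attribute-value layout) concrete, here is a minimal sketch of the three tables in Python with SQLAlchemy; the names are illustrative assumptions, not a recommendation, and the poster's Laravel/Eloquent equivalent would be expressed as migrations instead:

      from sqlalchemy import Column, ForeignKey, Integer, String
      from sqlalchemy.orm import declarative_base, relationship

      Base = declarative_base()

      class Product(Base):
          __tablename__ = "products"
          id = Column(Integer, primary_key=True)
          name = Column(String(255), nullable=False)
          values = relationship("AttributeValue", back_populates="product")

      class Attribute(Base):
          __tablename__ = "attributes"
          id = Column(Integer, primary_key=True)
          name = Column(String(64), unique=True)  # e.g. "diameter", "manufacturer"

      class AttributeValue(Base):
          __tablename__ = "attribute_values"
          id = Column(Integer, primary_key=True)
          product_id = Column(Integer, ForeignKey("products.id"), nullable=False)
          attribute_id = Column(Integer, ForeignKey("attributes.id"), nullable=False)
          value = Column(String(255))  # varchar regardless of the real data type
          product = relationship("Product", back_populates="values")

    Filtering ("all glasses where material = titanium") then becomes a join from products through attribute_values to attributes, which is exactly the query that needs indexing care in this design.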

    Read the article

  • Database Context and Singleton injection with IoC

    - by zaitsman
    All of the below relates to an ASP.NET C# app. I have a singleton Settings MemoryCache that reads values from the database on first access and caches them, then invalidates them using a SQL Service Broker message and re-reads as required. For the purposes of standard controllers, I create my DbContext in request scope. However, this obviously means that I can't use the same context in the Settings cache class, since that is a singleton and we have a scope collision. At the moment I have ended up with two DbContexts: the controllers get theirs via the IoC container, whereas the singleton just creates its own. However, I am not satisfied with this approach (mostly due to the way I feel about having two contexts; since the cache doesn't write anything to the db, concurrency is not much of an issue). What is a better way to do it?

    Read the article

  • Are there any concerns with using a static read-only unit of work so that it behaves like a cache?

    - by Rowan Freeman
    Related question: How do I cache data that rarely changes? I'm making an ASP.NET MVC4 application. On every request the security details about the user will need to be checked against the area/controller/action that they are accessing to see if they are allowed to view it. The security information is stored in the database. For example: User, Permission, UserPermission, Action, ActionPermission. A "Permission" is a token that is applied to an MVC action to indicate that the token is required in order to access the action. Once a user is given the permission (via the UserPermission table) then they have the token and can therefore access the action. I've been looking into how to cache this data (since it rarely changes) so that I'm only querying in-memory data and not hitting the database (which is a considerable performance hit at the moment). I've tried storing things in lists and using a caching provider, but I either run into problems or performance doesn't improve. One problem that I constantly run into is that I'm using lazy loading and dynamic proxies with EntityFramework. This means that even if I ToList() everything and store it somewhere static, the relationships are never populated. For example, User.Permissions is an ICollection but it's always null. I don't want to Include() everything because I'm trying to keep things simple and generic (and easy to modify). One thing I know is that an EntityFramework DbContext is a unit of work that acts with 1st-level caching. That is, for the duration of the unit of work, everything that is accessed is cached in memory. I want to create a read-only DbContext that will exist indefinitely and will only be used to read permission data. Upon testing this it worked perfectly; my page load times went from 200ms+ to 20ms. I can easily force the data to refresh at certain intervals or simply leave it to refresh when the application pool is recycled. Basically it will behave like a cache. Note that the rest of the application will interact with other contexts that exist per request as normal. Is there any disadvantage to this approach? Could I be doing something different?
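    Stripped of the EntityFramework specifics, what the poster describes is a read-through cache with periodic refresh. A language-neutral sketch of that idea in Python (the names and the 5-minute TTL are illustrative assumptions, not EF code):

      import time

      class ReadOnlyCache:
          """Cache a rarely-changing query result, reloading after a TTL."""

          def __init__(self, loader, ttl_seconds=300):
              self._loader = loader          # e.g. a function that reads all
              self._ttl = ttl_seconds        # permissions from the database
              self._data = None
              self._loaded_at = 0.0

          def get(self):
              if self._data is None or time.time() - self._loaded_at > self._ttl:
                  self._data = self._loader()
                  self._loaded_at = time.time()
              return self._data

      permissions = ReadOnlyCache(lambda: {"admin": {"CanEdit", "CanDelete"}})
      print(permissions.get())  # first call loads; later calls hit memory

    The poster's long-lived DbContext gets the same effect from EF's first-level cache, with the stale-data trade-off handled by the interval refresh or app-pool recycle they mention.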

    Read the article

  • Android design advice - services & broadcast receivers

    - by basudz
    I'm in the process of learning the Android SDK and creating some projects to get a grasp on the system. The current project I'm working with works just fine but I'd like to get some advice about other ways I can go about designing it. Here's what it needs to do. When a text message is received from a specific number, it should fire off a toast message that repeats at a certain interval for a specific duration. To make this work, I created an SMS BroadcastReceiver and checked the incoming messages for the number I'm looking for. If found, an IntentService would be started that would pull out the interval and duration from saved shared prefs. The IntentService would then fire off a broadcast. The BroadcastReceiver for this would catch it and use the AlarmManager to handle the toast message repetitions. This all works just fine, but I'm wondering if there's a cleaner or more efficient way of going about doing this? Any suggestions or advice?

    Read the article

  • Ubuntu 14.10 no GUI or term login

    - by Lito
    I updated my Ubuntu 14.10 installation yesterday with apt-get dist-upgrade. I worked all afternoon, and after that I rebooted the computer. Since then, lightdm doesn't start (only the GNOME logo shows) and I cannot get any of the Ctrl + Shift + [1-6] terminals (the cursor just blinks). I have read a lot of posts with no success:
      - nvidia/intel conflicts (I have a laptop with an Intel graphics card)
      - I have enabled nomodeset and tried all the options in "My computer boots to a black screen, what options do I have to fix it?"
      - I have repaired grub
      - I can load recovery/livecd/windows and mount partitions and the network without problems
      - I have all packages and the system updated
    Here are my logs. X11 doesn't show any errors or problems and is loading all needed drivers. How can I raise the debug level? Best regards.

    Read the article

  • How to turn a pdf into a text searchable pdf?

    - by don.joey
    I have a number of scanned documents in PDF and I want to be able to search them. How can I do that? Essentially I have to OCR the PDF and then blend the extracted text back into a new PDF. I have unsuccessfully tried:
      - pdfocr (which gives me this issue: https://github.com/gkovacs/pdfocr/issues/7)
      - pdfsandwich (of which the software center says it is a poor package and I should not install it)
    Is there a software package I am unaware of? Or a script that does this?
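    The usual pipeline is: rasterize each PDF page, OCR it into a one-page searchable PDF, then merge the pages. A rough sketch in Python using pdf2image, pytesseract and PyPDF2 (assumed tooling, not one of the packages the asker tried; it requires the poppler and tesseract-ocr system packages, and the file names are hypothetical):

      import io

      from pdf2image import convert_from_path  # pip install pdf2image
      import pytesseract                       # pip install pytesseract
      from PyPDF2 import PdfMerger             # pip install PyPDF2

      merger = PdfMerger()
      for page_image in convert_from_path("scan.pdf", dpi=300):
          # Each call returns the bytes of a one-page PDF with an
          # invisible text layer over the original page image.
          page_pdf = pytesseract.image_to_pdf_or_xml(page_image, extension="pdf")
          merger.append(io.BytesIO(page_pdf))
      merger.write("scan_searchable.pdf")
      merger.close()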

    Read the article

  • boot problem in samsung

    - by user286366
    I have formatted my laptop completely and installed Ubuntu 14.04. After the installation, Ubuntu does not boot directly: I have to plug in the pen drive (which has the Ubuntu 14.04 live CD) every time at startup and press F4, which is for recovery. After booting, if I remove the pen drive, everything works normally. My laptop is a Samsung NP370 with an Intel i5 3rd-gen processor, 6GB RAM and a 1TB hard disk. Please help me.

    Read the article

  • Uninstalling ubuntu and installing windows [duplicate]

    - by user286430
    This question already has an answer here: How to remove Ubuntu and put Windows back on? I recently built a computer and installed Ubuntu 14.04 from USB. I struggle with using the interface, programs and terminal, so I plan to put a Windows 7 Ultimate ISO onto a USB while in Ubuntu, then remove Ubuntu and install Windows 7. How do I go about doing this and what programs do I use?

    Read the article

  • How to create a script which opens the browser and logins to a particular Gmail account?

    - by TechGuru
    I have created a bash script, login.sh. I want it to open the browser and log in to my Gmail account. I tried using the following command to open the browser with www.gmail.com:

      xdg-open http://gmail.com

    It opens the Gmail home page perfectly, but I don't know how to pass the username and password to log in to Gmail from the bash script. Is it possible to open the browser and log in to Gmail from a script?

    Read the article

  • Where has /lib/udev/keymap gone? How do I adjust keymaps in Trusty?

    - by dizpers
    I tried to use this tutorial to make the scroll switch work on my Microsoft Natural Ergonomic Keyboard 4000, but I get the following error:

      sudo: /lib/udev/keymap: command not found

    I have udev version 204-5ubuntu20.2 (the version found in Trusty). I noticed that this version doesn't include the keymap tool, but that greater udev versions (available for Debian, for example) do include it. Could somebody explain this difference for me? =) And what should I do in this case: install the package from a Debian repo?

    Read the article

  • PS2 Eyetoy Recording Quality

    - by Fire
    I have Ubuntu 12.04 LTS and a PS2 EyeToy "Namtai". Don't worry - this is not the stereotypical "how do I get my EyeToy working" question. My EyeToy works fine in Cheese, guvcview and various other media software like VLC. However, it seems like I am capped at 25fps. If I recall, the EyeToy can do much better than this (~60fps), but I cannot find any way to fix it. The best program I have found is VLC, because its advanced options allow you to change many settings, but the framerate setting appears to have no effect. What software or settings can I use to take full advantage of my EyeToy? To give you the full picture, I wish to attach multiple EyeToys to the system and record from all of them. (I couldn't get security software like motion and ZoneMinder installed correctly on my system, so I haven't tried those yet.) Edit: I tried the same camera on a Windows system and the frame rate is much better in VLC compared to the Ubuntu system. Mind you, with default settings in Windows VLC the resolution isn't great; in Skype, for example, the resolution is amazing and the framerate is good. It seems there must be some setting I am missing somewhere, because it doesn't appear to be a hardware problem...

    Read the article
