Search Results

Search found 15137 results on 606 pages for 'global state'.


  • IPgallery banks on Solaris SPARC

    - by Frederic Pariente
    IPgallery is a global supplier of converged legacy and Next Generation Networks (NGN) products and solutions, including core network components and cloud-based Value Added Services (VAS) for voice, video and data sessions. IPgallery enables network operators and service providers to offer advanced converged voice, chat, video/content services and rich unified social communications in combined legacy (fixed/mobile), Over-the-Top (OTT) and Social Community (SC) environments for home and business customers. Technically speaking, this offer is a scalable and robust telco solution enabling operators to offer new services while controlling operating expenses (OPEX). In its solutions, IPgallery leverages the following Oracle components: Oracle Solaris, Netra T4 and SPARC T4, in order to provide a competitive and scalable solution without the price tag often associated with high-end systems.
    Oracle Solaris Binary Application Guarantee
    A unique feature of Oracle Solaris is the guaranteed binary compatibility between releases of the Solaris OS. That means that if a binary application runs on Solaris 2.6 or later, it will run on the latest release of Oracle Solaris. IPgallery developed their application on Solaris 9 and Solaris 10 and now runs it on Solaris 11, without any code modification or rebuild. The Solaris Binary Application Guarantee helps IPgallery protect their long-term investment in the development, training and maintenance of their applications.
    Oracle Solaris Image Packaging System (IPS)
    IPS is a new repository-based package management system that comes with Oracle Solaris 11. It provides a framework for complete software life-cycle management, such as installation, upgrade and removal of software packages. IPgallery leverages this new packaging system in order to speed up and simplify software installation for the R&D and production environments. Notably, they use IPS to deliver Solaris Studio 12.3 packages as part of the rapid installation process of R&D environments, and during the production software deployment phase they ensure software package integrity using the built-in verification feature. Solaris IPS thus improves IPgallery's time-to-market with faster, more reliable software installation and deployment in production environments.
    Extreme Network Performance
    IPgallery saw a huge improvement in application performance, in both CPU and I/O, when running on the SPARC T4 architecture compared to UltraSPARC T2 servers. The same application (with the same activation environment) running on T2 consumes 40%-50% CPU, while it consumes only 10% of the CPU on T4. The testing environment comprised a Softswitch (call management), TappS (Telecom Application Server) and a Billing Server running on the same machine and initiating various services at a rate of 1000 CAPS (Call Attempts Per Second). In addition, tests showed a huge improvement in the performance of the TCP/IP stack, which reduces network-layer processing and, ultimately, call-attempt latency. Finally, there is a huge improvement in file system and disk I/O operations; they ran all tests with maximum logging enabled and it did not influence any benchmark values.
    "Due to the huge improvements in performance and capacity using the T4-1 architecture, IPgallery has engineered the solution with less hardware. This means instead of deploying the solution on six T2-based machines, we will deploy on 2 redundant machines while utilizing Oracle Solaris Zones and Oracle VM for higher availability and virtualization," said Shimon Lichter, VP R&D, IPgallery.
    In conclusion, using the unique combination of Oracle Solaris and SPARC technologies, IPgallery is able to offer solutions with much lower TCO, while providing a higher level of service capacity, scalability and resiliency. This low-OPEX solution enables the operator, IPgallery's end customer, to deliver a high-quality service while maintaining high profitability.

    Read the article

  • Do MORE with WebCenter

    - by Michael Snow
    WEBCAST THURSDAY!! 03/22/12 Do you need to lower costs? Raise Productivity? Foster Innovation? Improve Online Engagement? But you’re still stuck with Documentum? Step away from the ledge – there is hope – let us help you. Top 4 Content Imperatives · Lower Costs - Reduce labor, maintenance fees, storage and electrical consumption · Raise Productivity - Automation and integration, communication, findability · Foster Innovation - Enable collaboration, expertise location · Improve Online Engagement – enable user-driven, dynamic marketing initiatives With the coming technology wave we see four content imperatives. Every organization has had to reduce costs, cost cutting has become a way of life. Everyone is working three jobs as positions are eliminated. And so we have to reduce labor, reduce maintenance, and reduce money we are wasting on things like storing content that is redundant or no longer useful. We also, to fill that gap, need to raise productivity. Knowledge workers represent the fastest growing segment of the workforce, accounting for 40%-75% of the employees at organizations in sectors like financial services, life sciences, healthcare and retail.  What’s more, their wages total 18 percent of the United States GDP. And so we can’t afford information systems that don’t let our top performers be the best they can be. We look to automate the content processes, provide ways to integrate that content into our processes, provide communication to make decisions, and to make content more findable so people can make the right decision and move the process forward. And really to get ourselves out of the current financial status, we can only cut costs so far. We have to innovate out of economic tough times – to find new products and new markets. And to enable the innovation process, we have to enable collaboration and expertise location. So much of innovation is about building on innovations that have come before. To solve problems, we have to be able to find what our organization has already created. We find that problems we need to solve have already been solved if we can find the right document, the right person. So we have to provide systems that enable us to stand on the shoulders of our organization’s accomplishments. Good content drives great marketing. Online engagement is growing as an absolute necessity for modern growing marketing organizations that require the business users be enabled for dynamic marketing content creation, updates and targeted content creation and management. Unfortunately – if you are currently stuck with Documentum, you are really lacking in your Web Experience Management capabilities. Documentum previously used FatWire for web publishing. Now FatWire is part of Oracle. 
    Oracle provides powerful web engagement capabilities: · Increase sales and loyalty by optimizing online engagement · Create, manage and moderate contextually relevant, targeted and interactive online experiences · Optimize customer engagement across web, mobile and social channels · Manage a large-scale, multichannel global online presence with integration to enterprise applications · Enable business users to control their content and make their own updates · Publish content from native files – enable navigation of project documents, procedures, policy information · Enable content display and updates from existing web applications – one click to drag and drop content management functionality. So you get the ability to self-publish information and make it navigable, to move the process of publishing from IT to business users, and the ability to address a whole new area of user engagement with web experience management. So… if you are still stuck with Documentum and don’t know what to do – contact us – not only will Oracle help you step away from the ledge, but with the MoveOff Documentum program we are also offering you a way out – trade in your Documentum licenses for a 100% credit on Oracle WebCenter. How’s that for a nice bonus? It’s time to stop maintaining Documentum, and to start innovating with Oracle WebCenter. Learn More Here! To learn more about what Oracle WebCenter can offer you today – join us for a webcast – your eyes will be opened to all that’s possible. Do More with WebCenter: Extend Beyond Content Management

    Read the article

  • Boot Problem in Asus EEE PC 1015CX

    - by Sâmrat VikrãmAdityá
    I am a newbie to the Linux world, although I have previously worked on Ubuntu 11.04 for daily use (net access and simple recordings using Audacity). I am not sure at what level I stand as a newbie. I bought this Asus Eee PC two days back. The model is Asus 1015CX. See the specs here: http://www.flipkart.com/asus-1015cx-blk011w-laptop-2nd-gen-atom-dual-core-1gb-320gb-linux/p/itmd8qu4quzu8srr . I created a live USB to install 12.10. The USB booted fine. When I clicked the "Try Ubuntu" option, it showed me a black screen with a blinking cursor. I waited for 15 minutes and had to restart using the power button. On clicking the "Install Ubuntu" button, the install process went seamlessly. [I have Windows 7 installed on one of the partitions.] I installed Ubuntu alongside the previous Windows installation. The system was then rebooted for the first time. It showed the GRUB menu and I selected the first option, Ubuntu. After showing the splash screen for a second, it began showing various messages on a black screen and then got stuck on the "Stopping Save kernel state" message. I had to force shut the system using the power button. Sometimes it just gives a blank screen with a blinking cursor, and on pressing the power button some messages pop up stating that acpid is doing something and that services are being stopped, and the system shuts down. I tried booting with "nomodeset" and other parameters, as directed in solutions to similar problems on the forums. Also, Ctrl+Alt+F1, F2, F3 ... F12 does nothing for me anywhere. At installation, I checked the "Login automatically" option. On booting into recovery, several options come up. Clicking resume just gives me a blank screen with a blinking cursor. On dropping to a root shell and remounting the filesystem as read-write, I was able to run some commands that worked for others: startx -- several messages come up, the last one stating "Fatal error: No screen found"; sudo service lightdm start -- gives a blank screen with a blinking cursor; lspci | grep VGA -- shows some Intel integrated graphics... something I don't remember. I have reconfigured xserver-xorg and lightdm, and reinstalled ubuntu-desktop and unity. What should I do? Will going back to 11.04 work? Or should I give up all hope of running Ubuntu on my netbook? Please help.

    Read the article

  • Given the presentation model pattern, is the view, presentation model, or model responsible for adding child views to an existing view at runtime?

    - by Ryan Taylor
    I am building a Flex 4 based application using the presentation model design pattern. This application will have several different components, as shown in the image below. The MainView and DashboardView will always be visible, and they each have corresponding presentation models and models as necessary. These views are easily created by declaring their MXML in the application root. <s:HGroup width="100%" height="100%"> <MainView width="75%" height="100%"/> <DashboardView width="25%" height="100%"/> </s:HGroup> There will also be many WidgetViewN views that can be added to the DashboardView by the user at runtime through a simple drop down list. This will need to be accomplished via ActionScript. The drop down list should always show which WidgetViewN views have already been added to the DashboardView. Therefore some state about which WidgetViewN views have been created needs to be stored. Since the list of available WidgetViewN views, and which ones are added to the DashboardView, also needs to be accessible from other components in the system, I think this needs to be stored in a Model object. My understanding of the presentation model design pattern is that the view is very lean. It contains as close to zero logic as is practical. The view communicates/binds to the presentation model, which contains all the necessary view logic. The presentation model is effectively an abstract representation of the view which supports low coupling and eases testability. The presentation model may have one or more models injected into it in order to display the necessary information. The models themselves contain no view logic whatsoever. So I have several questions around this design. Who should be responsible for creating the WidgetViewN components and adding these to the DashboardView? Is this the responsibility of the DashboardView, DashboardPresentationModel, DashboardModel or something else entirely? It seems like the DashboardPresentationModel would be responsible for creating/adding/removing any child views from its display, but how do you do this without passing the DashboardView in to the DashboardPresentationModel? The list of available and visible WidgetViewN components needs to be accessible to a few other components as well. Is it okay for a reference to a WidgetViewN to be stored/referenced in a model? Are there any good examples of the presentation model pattern online in Flex that also include creating child views at runtime?
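    A note on the responsibility split the question describes: although the question is Flex-specific, the usual answer to "who adds the child views?" can be sketched in a language-neutral way. The hypothetical C# sketch below (none of these class names come from the question's code) keeps the presentation model ignorant of any view class: it only owns the collection of widget models, which other components can also read, while the view listens for collection changes and creates or removes its own child views.

        using System;
        using System.Collections.ObjectModel;
        using System.Collections.Specialized;

        // Hypothetical model object: which widgets exist, nothing about how they look.
        class WidgetModel
        {
            public string Name { get; set; } = "";
        }

        // The presentation model owns the state; it never references a view.
        class DashboardPresentationModel
        {
            public ObservableCollection<WidgetModel> VisibleWidgets { get; } =
                new ObservableCollection<WidgetModel>();

            public void AddWidget(string name) =>
                VisibleWidgets.Add(new WidgetModel { Name = name });

            public void RemoveWidget(WidgetModel widget) =>
                VisibleWidgets.Remove(widget);
        }

        // The view maps model changes onto child views; this is its only job.
        class DashboardView
        {
            public DashboardView(DashboardPresentationModel pm)
            {
                pm.VisibleWidgets.CollectionChanged += (sender, e) =>
                {
                    if (e.Action == NotifyCollectionChangedAction.Add && e.NewItems != null)
                        foreach (WidgetModel w in e.NewItems)
                            Console.WriteLine($"create child view for {w.Name}");

                    if (e.Action == NotifyCollectionChangedAction.Remove && e.OldItems != null)
                        foreach (WidgetModel w in e.OldItems)
                            Console.WriteLine($"remove child view for {w.Name}");
                };
            }
        }

        class Demo
        {
            static void Main()
            {
                var pm = new DashboardPresentationModel();
                var view = new DashboardView(pm);
                pm.AddWidget("WidgetView1"); // the drop down list would call this
            }
        }

    In Flex, the same shape is typically achieved with a bindable collection on the presentation model and a collection-change listener (or a list with an item renderer) on the DashboardView, so the presentation model stays free of view references.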

    Read the article

  • MySQL Enterprise Backup 3.8.2 - Overview

    - by Priya Jayakumar
    MySQL Enterprise Backup (MEB) is the ideal solution for backing up MySQL databases. MEB 3.8.2 was released in June 2013. The main goal of the MySQL Enterprise Backup 3.8.2 release is to improve usability. With this release, users can see how much of the backup has completed, both in terms of size and as a percentage of the total. This release also offers options to manage the behavior of MEB when the space on the secondary storage is completely exhausted during a backup. The progress indicator is a (short) string that indicates how far the execution of a time-consuming MEB command has progressed. It consists of one or more "meters" that measure the progress of the command. Two options are introduced to control the progress reporting of the mysqlbackup command: (1) --show-progress and (2) --progress-interval. The user can control the progress indicator by using the "--show-progress" option in any of the MEB operations. This option instructs MEB to output periodic short reports on the progress of time-consuming commands. The argument of this option specifies where the output is sent: for example, it could be stderr, stdout, file, fifo or table. With the "--show-progress" option, both the total size of the backup to be copied and the size already copied are shown. Along with this, the state of the operation (for example, data or metadata being copied, tables being locked, and other such operations) is also reported. This gives the DBA clearer information on the progress of the backup. The interval between progress reports, in seconds, is controlled by the "--progress-interval" option. For more information on this, please refer to progress-report-options. MEB can also be accessed through a GUI in the next version of MySQL Workbench. This can be used as the front-end interface for MEB users to perform backup operations at the click of a button. This feature was highly requested by DBAs and will be very useful. Refer to http://insidemysql.com/mysql-workbench-6-0-a-sneak-preview/ for information on the upcoming Workbench release. Along with the progress reporting feature, some other important issues are also addressed in MEB 3.8.2. In MEB 3.8.2 a new command line option, "--on-disk-full", is introduced to abort or warn the user when a backup process encounters a full-disk condition; when no value is given, it aborts by default. A few issues related to incremental backup are also addressed in this release; please refer to the 3.8.2 documentation for more details. It would be good for MEB users to move to 3.8.2 to take incremental backups. Overall, the added usability and the important defects fixed in this release make MySQL Enterprise Backup 3.8.2 a promising release.

    Read the article

  • How to install chrome autosave extension?

    - by Oguz Can Sertel
    I would like to install chrome autosave plugin on ubuntu. when I try to install it with these steps https://github.com/NV/chrome-devtools-autosave-server , I got some errors... there was not installed node and npm out of box on ubuntu 12.10. So I installed npm and node with these commands. sudo apt-get install npm sudo apt-get install node and I tried to install autosave here is the output: sudo npm install -g autosave npm http GET https://registry.npmjs.org/autosave npm http 304 https://registry.npmjs.org/autosave npm http GET https://registry.npmjs.org/commander npm http 304 https://registry.npmjs.org/commander /usr/local/bin/autosave -> /usr/local/lib/node_modules/autosave/bin/autosave > [email protected] install /usr/local/lib/node_modules/autosave > node ./scripts/install.js npm ERR! error installing [email protected] npm WARN This failure might be due to the use of legacy binary "node" npm WARN For further explanations, please read npm WARN /usr/share/doc/nodejs/README.Debian npm WARN npm ERR! [email protected] install: `node ./scripts/install.js` npm ERR! `sh "-c" "node ./scripts/install.js"` failed with 1 npm ERR! npm ERR! Failed at the [email protected] install script. npm ERR! This is most likely a problem with the autosave package, npm ERR! not with npm itself. npm ERR! Tell the author that this fails on your system: npm ERR! node ./scripts/install.js npm ERR! You can get their info via: npm ERR! npm owner ls autosave npm ERR! There is likely additional logging output above. npm ERR! npm ERR! System Linux 3.5.0-17-generic npm ERR! command "/usr/bin/nodejs" "/usr/bin/npm" "install" "-g" "autosave" npm ERR! cwd /home/naczu npm ERR! node -v v0.6.19 npm ERR! npm -v 1.1.4 npm ERR! code ELIFECYCLE npm ERR! message [email protected] install: `node ./scripts/install.js` npm ERR! message `sh "-c" "node ./scripts/install.js"` failed with 1 npm ERR! errno {} npm ERR! npm ERR! Additional logging details can be found in: npm ERR! /home/naczu/npm-debug.log npm not ok and here is README.debian nodejs for Debian ================= packaged modules ---------------- The global search path for modules is /usr/lib/nodejs Future packages of node modules will use that directory, so it should be used wisely. user modules ------------ Node looks for modules in ./node_modules directory first; please read node#modules documentation carefully for more information. Node does not look for modules in /usr/local/lib/node_modules, where npm put them. Please read npm-link(1) of npm package, to understand how to properly use npm-installed modules in a project. Note that require.paths is not supported in future node versions. See also node(1) for more information about NODE_PATH. nodejs command -------------- The upstream name for the Node.js interpreter command is "node". In Debian the interpreter command has been changed to "nodejs". This was done to prevent a namespace collision: other commands use the same name in their upstreams, such as ax25-node from the "node" package. Scripts calling Node.js as a shell command must be changed to instead use the "nodejs" command.

    Read the article

  • Duke's Choice Award Ceremony

    - by Tori Wieldt
    The 2012 Duke's Choice Awards winners and their creative, Java-based technologies and Java community contributions were honored after the Sunday night JavaOne keynotes. Sharat Chander, Group Director for Java Technology Outreach, presented the awards. "Having the community participate directly in both submission and selection truly shows how we are driving exposure of the innovation happening in the Java community," he said. Apache Software Foundation Hadoop Project The Apache Software Foundation’s Hadoop project, written in Java, provides a framework for distributed processing of big data sets across clusters of computers, ranging from a few servers to thousands of machines. This harnessing of large data pools allows organizations to better understand and improve their business. AgroSense Project Improving farming methods to feed a hungry world is the goal of AgroSense, an open source farm information management system built in Java and the NetBeans platform. AgroSense enables farmers, agribusinesses, suppliers and others to develop modular applications that will easily exchange information through a common underlying NetBeans framework. JDuchess Rather than focus on a specific geographic area like most Java User Groups (JUGs), JDuchess fosters the participation of women in the Java community worldwide. The group has more than 500 members in 60 countries, and provides a platform through which women can connect with each other and get involved in all aspects of the Java community. Jelastic, Inc. Moving existing Java applications to the cloud can be a daunting task, but startup Jelastic, Inc. offers the first all-Java platform-as-a-service (PaaS) that enables existing Java applications to be deployed in the cloud without code changes or lock-in. Liquid Robotics Robotics – Liquid Robotics is an ocean data services provider whose Wave Glider technology collects information from the world’s oceans for application in government, science and commercial applications. The organization features the “father of Java” James Gosling as its chief software architect. London Java Community The second user group receiving a Duke’s Choice Award this year, the London Java Community (LJC) and its users have been active in the OpenJDK, the Java Community Process (JCP) and other efforts within the global Java community. NATO The first-ever Community Choice Award goes to the MASE Integrated Console Environment (MICE) in use at NATO. Built in Java on the NetBeans platform, MICE provides a high-performance visualization environment for conducting air defense and battle-space operations. Parleys.com E-learning specialist Parleys.com, based in Brussels, Belgium, uses Java technologies to bring online classes and full IT conferences to desktops, laptops, tablets and mobile devices. Parleys.com has hosted more than 1,700 conferences—including Devoxx and JavaOne—for more than 800,000 unique visitors. Student Nokia Developer Group This year’s student winner, Ram Kashyap, is the founder and president of the Nokia Student Network, and was profiled in the “The New Java Developers” feature in the March/April 2012 issue of Java Magazine. Since then, Ram has maintained a hectic pace, graduating from the People’s Education Society Institute of Technology in Bangalore, India, while working on a Java mobile startup and training students on Java ME. 
United Nations High Commissioner for Refugees The United Nations High Commissioner for Refugees (UNHCR) is on the front lines of crises around the world, from civil wars to natural disasters. To help facilitate its mission of humanitarian relief, the UNHCR has developed a light-client Java application on the NetBeans platform. The Level One registration tool enables the UNHCR to collect information on the number of refugees and their water, food, housing, health, and other needs in the field, and combines that with geocoding information from various sources. This enables the UNHCR to deliver the appropriate kind and amount of assistance where it is needed. You can read more about the winners in the current issue of Java Magazine.

    Read the article

  • Speaking at Mix11

    - by Dennis Vroegop
    In April Microsoft will hold the next MIX event. MIX was usually targeted at web designers and developers but has grown over the years to be more a general conference focused on the web and devices. In other words: everything the normal consumer might encounter. It’s not your typical developers conference, although you’ll find many developers there as well. But next to the developers you’ll probably run into designers and user experience specialists as well. This year I am proud to say that I will be one of the people presenting there. Together with all the Surface MVP’s in the world (sounds impressive, but there are only 7 of us) we’ll host a panel discussion on all things Surface, NUI and everything else that matches those subjects. Here’s what the abstract says: The Natural User Interface (NUI) is a hot topic that generates a lot of excitement, but there are only a handful of companies doing real innovation with NUIs and most of the practical experience in the NUI style of design and development is limited to a small number of experts. The Microsoft Surface MVPs are a subset of these experts that have extensive real-world experience with Microsoft Surface and other NUI devices. This session is a panel featuring the Microsoft Surface MVPs and an unfiltered discussion with each other and the audience about the state of the art in NUI design and development. We will share our experiences and ideas, discuss what we think NUI will look like in the near future, and back up our statements with cutting-edge demonstrations prepared by the panelists involving combinations of Microsoft Surface 2.0, Kinect, and Windows Phone 7. We, as Surface MVPs think we are more than just Surface oriented. We like to think we are more NUI MVP’s. But since that’s not a technology with Microsoft you can’t actually become a NUI MVP so Surface is the one that comes the closest. We are currently working on the details of our session but believe me: it will blow you away. Several people we talked to have said this could potentially be the best session of Mix. Quite a challenge, but we’re up for it! Of course I won’t be telling you exactly what we’re going to do in Las Vegas but rest assured that when you visit our session you’ll leave with a lot of new ideas and hopefully be inspired to bring into practice what you’ve seen. Even if the technology we’ll show you isn’t readily available yet. So, if you are in Las Vegas between April 12th and 14th, please join Joshua Blake, Neil Roodyn, Rick Barraza, Bart Roozendaal, Josh Santangelo, Nicolas Calvi and myself for some NUI fun! See you in Vegas! Tags van Technorati: mix11,las vegas,surface,nui,kinecct

    Read the article

  • Enterprise Integration: Can Companies Afford It?

    - by Ralph Wheaton
    Each year, my company holds a global sales conference where employees and partners from around the world come together to collaborate, share knowledge and ideas and learn about future plans.  Several of us in the professional services division had been asked to make a presentation, an elevator pitch in 3 minutes or less that relates to a success we have worked on or directly relates to our tag (that is, our primary technology focus).  Mine happens to be Enterprise Integration as it relates to Business Intelligence.  I found it rather difficult to present that pitch in a short amount of time and had to pare it down.  At any rate, in just a little over 3 minutes, this is the presentation I submitted.  Here is a link to the full presentation video in WMV format.   Many companies today subscribe to a buy-versus-build mentality in an attempt to drive down costs and improve time to implementation. Sometimes this makes sense, especially as it relates to specialized software or software that performs a small number of tasks extremely well. However, if not carefully considered or planned out, this oftentimes leads to multiple disparate systems with silos of data or multiple versions of the same data. For instance, client data (contact information, addresses, phone numbers, opportunities, sales) stored in your CRM system may not play well with Accounts Receivable. Employee data may be stored across multiple systems such as HR, Time Entry and Payroll. Other data (such as member data) may not originate internally, but be provided by multiple outside sources in multiple formats. And to top it all off, some data may have to be manually entered into multiple systems to keep it all synchronized. When left to grow out of control like this, overall performance is lacking, stability is questionable and maintenance is frequent and costly. Worse yet, in many cases, this topology, this hodgepodge of data, creates a reporting nightmare. Decision makers are forced to try to put together pieces of the puzzle, attempting to find the information they need, wading through multiple systems to find what they think is the single version of the truth. More often than not, they find they are missing pieces, pieces that may be crucial to growing the business rather than closing the business. This is where an enterprise integration solution comes in: data is moved and kept consistent across applications. Master data owners are defined to establish single sources of data (such as the CRM system owning client data). Other systems subscribe to the master data and changes are replicated to subscribers as they are made. This can be one-way (no changes are allowed on the subscriber systems) or bi-directional. But at all times, the master data owner is current, or up to date. And all data, whether internal or external, uses the same processes and methods to move from one place to another, leveraging the same validations, lookups and transformations enterprise-wide, eliminating inconsistencies and siloed data. Once implemented, an enterprise integration solution improves performance and stability by reducing the number of moving parts and eliminating inconsistent data. Overall maintenance costs are mitigated by reducing touch points, or the number of places that require modification when a business rule is changed or another data element is added. Most importantly, however, decision makers can now easily extract and piece together the information they need to grow their business, improve customer satisfaction and so on. 
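    To make the master-data idea above concrete, here is a minimal, hypothetical sketch in C# (the system and class names are invented for illustration and are not from the article): the CRM system acts as the master data owner for client records and replicates changes one way to subscribing systems, which keep read-only local copies.

        using System;
        using System.Collections.Generic;

        // A client record as the master data owner publishes it.
        record ClientRecord(string ClientId, string Name, string Address);

        // The CRM system owns client data: all changes are made here and pushed out.
        class CrmMasterDataOwner
        {
            public event Action<ClientRecord>? ClientChanged;

            public void UpdateClient(ClientRecord client)
            {
                // ... persist to the CRM store, the single source of truth ...
                ClientChanged?.Invoke(client); // replicate to every subscriber
            }
        }

        // A subscribing system keeps a read-only copy; it never writes back.
        class AccountsReceivableSubscriber
        {
            private readonly Dictionary<string, ClientRecord> localCopy = new();

            public void Subscribe(CrmMasterDataOwner owner) =>
                owner.ClientChanged += c => localCopy[c.ClientId] = c;

            public ClientRecord? Lookup(string clientId) =>
                localCopy.TryGetValue(clientId, out var c) ? c : null;
        }

        class Demo
        {
            static void Main()
            {
                var crm = new CrmMasterDataOwner();
                var accountsReceivable = new AccountsReceivableSubscriber();
                accountsReceivable.Subscribe(crm);

                crm.UpdateClient(new ClientRecord("C-42", "Acme Corp", "1 Main St"));
                Console.WriteLine(accountsReceivable.Lookup("C-42")?.Name); // Acme Corp
            }
        }

    A bi-directional variant would add conflict-resolution rules, which is exactly the kind of logic the article suggests centralizing alongside shared validations, lookups and transformations.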
So, in implementing an enterprise integration solution, companies can position themselves for the future, allowing for easy transition to data marts, data warehousing and, ultimately, business intelligence. Along this path, companies can achieve growth in size, intelligence and complexity. Truly, the question is not can companies afford to implement an enterprise integration solution, but can they afford not to.   Ralph Wheaton Microsoft Certified Technology Specialist Microsoft Certified Professional Developer Microsoft VTS-P BizTalk, .Net

    Read the article

  • Crash Report in Ubuntu... hardware problem?

    - by Andrew
    Got this on my machine. I was just browsing the web on Chrome and my computer froze. I recently just built this machine. I have a feeling it is a hardware problem... Possibly one of my parts arrived broken in some way.... Starting anac(h)ronistic cron Stopping anac(h)ronistic cron Stopping cold plug devices Stopping log initial device creation Starting enable remaining boot-time encrypted block devices Starting configure network device security Starting configure virtual network devices Starting save udev log and update rules Stopping configure virtual network devices Stopping save udev log and update rules Checking battery state... Stopping System V runlevel compatibility Stopping enable remaining boot-time encrypted block devices Stopping Mount filesystems on boot 91.573384] BUG: unable to handle kernel NULL pointer dereference at (null) 91.573437] IP: [<ffffffff81313514>] strcmp+0x14/0x30 91.573470] PGD 1f7822067 PUD 1ed7a6067 PMD 0 91.573498] Oops: 0000 [#1] SMP 91.573519] CPU 3 91.573531] Modules linked in: dm_crypt bnep snd_hda_codec_realtek rfcomm bluetooth parport_pc ppdev arc4 fglrx(P) rt2800usb rt2800lib crc_ccitt rt2x00usb rt2x00lib mac0021 cfg80211 psmouse snd_hda_intel snd_hda_codec snd_hwdep snd_pcm snd_seq_midi snd_rawmidi snd_seq_midi_event snd_seq snd_timer send_seq_device snd joydev mac_hid mei(C) soundcore serio_raw snd_page_alloc lp parport ses enclosure usbhid hid i915 drm_kms_helper drm i2c_algo_bit mxm_umi tg_video wmi usb_storage 91.573826] 91.573837] Pid: 2297, comm: update-notifier Tainted: P C O 3.2.0-29-generic #46-Ubuntu To Be Filled By O.E.M. To Be Filled By O.E.M./Z77 Extreme4 91.573912] RIP: 0010:[<ffffffff81313514>] [<ffffffff81313514>] strcmp+0x14/0x30 91.573954] RSP: 0018:ffff8801f83f5bb8 EFLAGS: 00010246 91.573982] RAX: 0000000000000000 RBX: 0000000000000000 RCX: 0000000000000000 91.574019] RDX: 0000000000000069 RSI: 0000000000000000 RDI: ffff88021adb26f8 91.574056] RBP: ffff8801f83f5bb8 R08: ffff88022f2d6e80 R09: 0000000000000000 91.574093] R10: ffff88021e7dbf00 R11: 0000000000000003 R12: ffff88021c10eb40 91.574130] R13: 0000000000000000 R14: ffff88021adb26f8 R15: ffff8801f83f5d40 91.574168] FS: 00007f958cf53940(0000) GS:ffff88022f2c0000(0000) kn1GS:0000000000000000 91.574210] CS: 0010 DS: 0000 ES: 0000 CR0: 0000000080050033 91.574240] CR2: 0000000000000000 CR3: 000000021f6d7000 CR4: 00000000000406e0 91.574277] DR0: 0000000000000000 DR1: 0000000000000000 DR2: 0000000000000000 91.574314] DR3: 0000000000000000 DR6: 00000000ffff0ff0 DR7: 0000000000000000 91.574351] Process update-notifier (pid: 2297, threadinfo ffff801f83f4000, task ffff880208fe2e00) 91.574397] Stack: 91.574409] ffff8801f83f5be8 ffffffff811ed509 ffff88021adb26c0 ffff88021b8b7020 91.574453] ffff88021b461c60 fffffffffffffffe ffff8801f83f5c18 ffffffff811ed61f 91.574496] ffff88021adb26c0 ffff88021b8b7020 ffff8801f83f5dc8 0000000000000001 91.574539] Call Trace: 91.574558] [<ffffffff811ed509] sysfs_find_dirent+0x59/0x110 91.574591] [<ffffffff811ed61f] sysfs_lookup+0x5f/0x110 91.574621] [<ffffffff81182745] d_alloc_and_lookup+0x45/0x90 91.574654] [<ffffffff8118fe65] ? d_lookup+0x35/0x60 91.574683] [<ffffffff811848d2] do_lookup+0x202/0x310 91.574712] [<ffffffff8118660c] path_lookupat+0x11c/0x750 91.574744] [<ffffffff81318db7] ? __strncpy_from_user+0x27/0x60 91.574778] [<ffffffff81186c71] do_path_lookup+0x31/0xc0 91.574809] [<ffffffff81187779] user_path_at_empty+0x59/0xa0 91.574842] [<ffffffff81187822] ? 
do_filp_open+0x42/0xa0 91.574872] [<ffffffff811877d1] user_path_at+0x11/0x20 91.574902] [<ffffffff8117c80a] vfs_fstatat+0x3a/0x70 91.574933] [<ffffffff81161cff] ? kmem_cache_free+0x2f/0x110 91.574965] [<ffffffff8117c85e] vfs_lstat+-x31/0x70 91.574993] [<ffffffff8117c9fa] sys_newlstat+0x1a/0x40 91.575022] [<ffffffff81176ee1] ? do_sys_open+0x171/0x220 91.575053] [<ffffffff8117cb1a] ? sys_readlinkat+0x7a/0xb0 91.575086] [<ffffffff81661ec2] system_call_fastpath+0x16/0x1b 91.575118] Code: 83 c1 01 40 84 ff 75 ef 5d c3 66 66 66 66 2e 0f 1f 84 00 00 00 00 00 00 55 31 c0 48 89 e5 66 2e 0f 1f 84 00 00 00 00 00 0f b6 14 07 <3a> 14 06 75 0f 48 83 c0 01 84 d2 75 ef 31 c0 5d c3 0f 1f 00 19 91.577243] RIP [<ffffffff81313514>] strcmp+0x14/0x30 91.579314] RSP <ffff8801f83f5bb8> 91.581385] CR2: 0000000000000000

    Read the article

  • Implementing Search for BlogReader Windows 8 Sample

    - by Harish Ranganathan
    The BlogReader sample is an excellent place to start building up your Windows 8 development skills.  The tutorial is available here and the complete source code is available here. Create a project called WindowsBlogReader and create pages for ItemsPage.xaml, SplitPage.xaml and DetailPage.xaml, and copy the corresponding code blocks from the sample listed above. Create a class file FeedData.cs and copy the code.  Finally, create a class DateConverter.cs and copy the code associated with it. With that you should be able to build and run the project.  There seems to be one issue with the sample feeds listed: the first feed (feed1) doesn't seem to expose it. So you can skip that one and use the second feed as the first feed.  You will end up with one feed less, but it works. I demonstrated this at the recent TechDays in Chennai: how we can use the Search contract to implement search within the blog titles. First off, we need to declare that the app will be using the Search contract, in the Package.appxmanifest file. Next, we need a handle on the search query when the user types into the search window in the Charms menu. If you have completed the code sample from the link above, you will have ItemsPage.xaml and ItemsPage.xaml.cs.  Open ItemsPage.xaml.cs and import the namespaces System.Collections.ObjectModel and System.Linq. In the ItemsPage() constructor, right after this.InitializeComponent(); add the following code: Windows.ApplicationModel.Search.SearchPane.GetForCurrentView().QuerySubmitted += ItemsPage_QuerySubmitted; This event is fired when users open up the Search panel from the Charms menu, type something and hit Enter. We need to implement the handler named in that delegate.  For that we need to pull the FeedDataSource instantiation up to the root of the class so it can be shared. So, add the following as the first line within the partial class: FeedDataSource feedDataSource; Also, modify the LoadState method as follows: protected override void LoadState(Object navigationParameter, Dictionary<String, Object> pageState)        {            feedDataSource = (FeedDataSource)App.Current.Resources["feedDataSource"];            if (feedDataSource != null)            {                this.DefaultViewModel["Items"] = feedDataSource.Feeds;            }        } Next, implement the ItemsPage_QuerySubmitted method: void ItemsPage_QuerySubmitted(Windows.ApplicationModel.Search.SearchPane sender, Windows.ApplicationModel.Search.SearchPaneQuerySubmittedEventArgs args)         {             this.DefaultViewModel["Items"] = from dynamic item in feedDataSource.Feeds                                              where                                              item.Title.Contains(args.QueryText)                                              select item;         } As you can see, we are using almost the same DefaultViewModel, with the change that we use a LINQ query to search for feeds whose Title matches the QueryText. With this we are ready to run the app.  Run the app, open the Charms menu with the Windows + C key combination, and type some text to search within the blogs. You can see that it filters the blogs that have the matching text. We can modify the above LINQ query to search other attributes such as the description, the actual blog content and so on (see the sketch just below). I have uploaded the complete code since the original WindowsBlogReader code is not available for download.  You can download it from here. Note:  this code is provided as-is without any warranties.  Cheers!
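    As a quick illustration of that last point, the query can be extended to also match against the feed description. This is only a sketch: it assumes the FeedData class from the tutorial exposes a Description string property (the original sample's FeedData does, but check your copy before relying on it).

        void ItemsPage_QuerySubmitted(Windows.ApplicationModel.Search.SearchPane sender,
            Windows.ApplicationModel.Search.SearchPaneQuerySubmittedEventArgs args)
        {
            // Match the query text against either the feed title or its description.
            this.DefaultViewModel["Items"] = from dynamic item in feedDataSource.Feeds
                                             where item.Title.Contains(args.QueryText)
                                                || item.Description.Contains(args.QueryText)
                                             select item;
        }

    Because the range variable is dynamic, the extra property access is resolved at runtime, so a missing Description would only fail when the query actually executes.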

    Read the article

  • Unlocking High Performance with Policy Administration Replacement

    - by helen.pitts(at)oracle.com
    It is clear the insurance industry is undergoing significant changes as it consolidates and prepares for growth. The increasing focus on customer centricity, enhanced and speedier product development capabilities, and compliance with regulatory changes has forced companies to rethink well-entrenched policy administration processes. In previous Oracle Insurance blogs I’ve highlighted industry research pointing to policy administration replacement as a top IT priority for carriers. It is predicted that by 2013, the global IT spend on policy administration alone is likely to be almost 22 percent of the total insurance IT spend. To achieve growth, insurers are adopting new pricing models, enhancing distribution reach, and quickly launching new products and services—all of which depend on agile and effective policy administration processes and technologies. Next month speakers from Oracle Insurance and Capgemini Financial Services will discuss how insurers can competitively drive high performance through policy administration replacement during a free, one-hour webcast hosted by LOMA. Roger Soppe, Oracle senior director, Insurance Strategy, together with Capgemini’s Lars Ernsting, leader, Life & Pensions COE, and Scott Mampre, vice president, Insurance, will be the speakers. Specifically, they’ll be highlighting:
    How replacing a legacy policy administration system with a modern, flexible platform optimizes IT and operations costs, creates consistent processes and eliminates resource redundancies
    How selecting the right partner, with the best blend of technology, operational, and consulting capabilities, is an important pre-requisite to unlock high performance from policy administration transformation and to achieve product, operational, and cost leadership
    The value of outsourcing closed block operations
    We look forward to your participation on Thursday, July 14, 11:00 a.m. ET. Please register now. Helen Pitts is senior product marketing manager for Oracle Insurance's life and annuities solutions.

    Read the article

  • How to display values from another website to an new html page?

    - by user3098728
    How can I display values from a different website in a new HTML file? This is an example field of the values that need to be displayed in the new HTML file, and I want to display the said values in the input box (Contract ID) of this page JSFiddle. I have two JS functions that should display those values, but unfortunately it's not working and I don't know how to display the value in the HTML input box. Please help me. Thank you. I want to display the said value in this input box: Here is the JS file that reads the values:

        function scanLapVerification() {
            try {
                var page_title = "Title";
                var el = getElement(document, "class", "view-operator-verification-title", "");
                if (!el || el.length == 0) return;
                if (el[0].innerText != page_title) return;
                var page_title = '';
                var el = getElement(document, "class", "workflowActivityDetailPanel", "");
                if (el && el.length > 0) {
                    var eltr = getElement(el[0], "tag", "tr", "");
                    if (eltr && eltr.length > 0) {
                        //Read Contract ID
                        var contractId = { CI: { id: null } };
                        var con_id = null;
                        for (var i = 0; i < eltr.length; i++) {
                            tr_text = eltr[i].innerText;
                            if (tr_text.substr(0, "Contract ID".length) == "Contract ID") con_id = "CI";
                            if (con_id && tr_text.substr(0, "Contract ID".length) == "Contract ID") {
                                contractId[con_id].id = tr_text.substr("Contract ID".length + 1, tr_text.length - "Contract ID".length - 1);
                            }
                        }
                        var contract_id = contractId.CI.id;
                        return { content: "cid_check", con_id: con_id };
                    }
                    return { status: "KO" };
                }
            } catch (e) {
                alert("Exception: scanLapVerification\n" + e.Description);
                return { status: "KO", message: e };
            }
        };

    And here is the second JS function that displays it in a new HTML page:

        function scanLapVerification() {
            chrome.tabs.sendRequest(tabLapVerification, { method: "scanLapVerification" }, function (response) {
                msgbox("receiveResponse: scanLapVerification " + jsonToString(response, "JSON"));
                //maintaining state in the background
                if (response.data.content == "cid_check") {
                    //Popup window features
                    var popupWindow = null;
                    var name;
                    var width = 550;
                    var height = 200;
                    var left = parseInt((screen.availWidth / 2) - (width / 2));
                    var top = parseInt((screen.availHeight / 2) - (height / 2));
                    var windowFeatures = "width=" + width + ",height=" + height + ",left=" + left + ",top=" + top + "screenX=" + left + ",screenY=" + top;
                    //Input new address with popup window
                    if (confirm("Does the client has new address?") == true) {
                        popupWindow = window.open('/htmlname.htm', "title", windowFeatures + encodeURIComponent(response.data.contract_id));
                        popupWindow.focus();
                    } else {
                        name = "";
                    }
                }
            });
        }

    Read the article

  • Which statically typed languages support intersection types for function return values?

    - by stakx
    Initial note: This question got closed after several edits because I lacked the proper terminology to state accurately what I was looking for. Sam Tobin-Hochstadt then posted a comment which made me recognise exactly what that was: programming languages that support intersection types for function return values. Now that the question has been re-opened, I've decided to improve it by rewriting it in a (hopefully) more precise manner. Therefore, some answers and comments below might no longer make sense because they refer to previous edits. (Please see the question's edit history in such cases.) Are there any popular statically & strongly typed programming languages (such as Haskell, generic Java, C#, F#, etc.) that support intersection types for function return values? If so, which, and how? (If I'm honest, I would really love to see someone demonstrate a way how to express intersection types in a mainstream language such as C# or Java.) I'll give a quick example of what intersection types might look like, using some pseudocode similar to C#: interface IX { … } interface IY { … } interface IB { … } class A : IX, IY { … } class B : IX, IY, IB { … } T fn() where T : IX, IY { return … ? new A() : new B(); } That is, the function fn returns an instance of some type T, of which the caller knows only that it implements interfaces IX and IY. (That is, unlike with generics, the caller doesn't get to choose the concrete type of T — the function does. From this I would suppose that T is in fact not a universal type, but an existential type.) P.S.: I'm aware that one could simply define a interface IXY : IX, IY and change the return type of fn to IXY. However, that is not really the same thing, because often you cannot bolt on an additional interface IXY to a previously defined type A which only implements IX and IY separately. Footnote: Some resources about intersection types: Wikipedia article for "Type system" has a subsection about intersection types. Report by Benjamin C. Pierce (1991), "Programming With Intersection Types, Union Types, and Polymorphism" David P. Cunningham (2005), "Intersection types in practice", which contains a case study about the Forsythe language, which is mentioned in the Wikipedia article. A Stack Overflow question, "Union types and intersection types" which got several good answers, among them this one which gives a pseudocode example of intersection types similar to mine above.
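    For what it's worth, mainstream C# (the language the pseudocode above resembles) cannot declare an intersection return type at all; the closest non-generic workaround, short of a combined IXY interface, is to hand the caller the same instance under both interface views. The sketch below is only an approximation of intersection types, not an example of real language support, and the type names mirror the question's hypothetical ones.

        using System;

        interface IX { void X(); }
        interface IY { void Y(); }

        class A : IX, IY
        {
            public void X() => Console.WriteLine("A.X");
            public void Y() => Console.WriteLine("A.Y");
        }

        static class Factory
        {
            // C# has no intersection return types, so the callee returns one object
            // twice, once typed as IX and once as IY. Both tuple items reference the
            // same instance, which is what an intersection type would guarantee.
            public static (IX AsX, IY AsY) Fn()
            {
                var a = new A();
                return (a, a);
            }
        }

        class Demo
        {
            static void Main()
            {
                var result = Factory.Fn();
                result.AsX.X();   // statically typed access via IX
                result.AsY.Y();   // statically typed access via IY
            }
        }

    This loses the type-level guarantee that both views refer to the same object, which is precisely what a true intersection return type (for example, Scala's IX with IY) would preserve.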

    Read the article

  • Uninstalling with Ubuntu Software Center doesn't work on Ubuntu 12.04.1 64bit

    - by likethesky
    Not sure if I'm doing something wrong, or if the .deb package I'm installing is broken in some way (I've built it, using NetBeans 7.2), or if indeed this is a bug in Software Center. When I install this particular 32-bit .deb on Ubuntu 10.04 LTS--all updates applied--(where it was built), GDebi shows it and has an 'Uninstall' button next to it. So it works fine to uninstall it there, via the GDebi GUI. However, when I install it on 12.04.1 LTS--all updates applied--it installs fine, but then does not show up in Ubuntu Software Center as available to be uninstalled. No combination of searching finds it. However, I can from the command line, do sudo apt-get purge javafxapplication1 and it finds it and deletes it. The same thing happens when I build a 64-bit .deb and attempt to install it to the same (64-bit AMD) or a different 64-bit Ubuntu 12.04.1 system. So it seems to be isolated to this NetBeans-generated .deb and the 64-bit AMD build (though I haven't tried it on a 32-bit 12.04.1 install yet). These are all on VirtualBox VMs, btw, if that matters. Any way to clean up my Software Center and see if it's something I've done to get it in this state? Could this behavior be due to how this particular .deb has been built? (It doesn't have an 'Installed-Size' control field, so I do get the "Package is of bad quality" warning when I install it--which I do by clicking 'Ignore and install' button.) If you want all the gory details about why this happening--a bug has been reported against NetBeans for this behavior here: http://javafx-jira.kenai.com/browse/RT-25486 (EDIT: Just to be clear, the app installs fine, runs fine, all works as intended--I just can't get that 'bad package' message to go away, and now... I also can't uninstall it via Software Center, but rather, need to use sudo apt-get purge to uninstall it, after it installs.) Thanks for any pointers. I'm happy to report this as a bug against Ubuntu Software Center/Centre too, if that's what it seems to be, just tell me where to do so (a link). I'm a relative Ubuntu, NetBeans, and JavaFX newbie, though a long-time programmer. If I report it as a bug, I'll try it on the 32-bit build of 12.04.1 as well. Also, if I should add any more detail to the bug reported against NetBeans above, let me know--or feel free to add it yourself to the bug report above, if you would like. Thanks again!

    Read the article

  • Where can you find the Oracle Applications User Experience team in the next several months?

    - by mvaughan
    By Misha Vaughan, Applications User Experience
    November is one of my favorite times of year at Oracle. The blast of OpenWorld work is over, and it’s time to get down to business and start taking our messages and our work on the road out to the user groups. We’re in the middle of planning all of that right now, so we decided to provide a snapshot of where you can see us and hear about the Oracle Applications User Experience – whether it’s Fusion Applications, PeopleSoft, or what we’re planning for the next generation of Oracle Applications.
    On the road with Apps UX...
    In December, you can find us at UKOUG 2012 in Birmingham, UK: UKOUG, UK Oracle User Group Conference 2012, December 3 - 5, 2012, ICC, Birmingham, UK.
    In March, we will be at Alliance 2013 in Indianapolis, and our fingers are crossed for OBUG Connect 2013 in Antwerp: Alliance 2013, March 17 - 20, 2013, Indianapolis, Indiana; OBUG Benelux Connect 2013, March 26, 2013, Antwerp, Belgium.
    In April, you will see us at COLLABORATE13 in Denver: Collaborate13, April 7 - 11, 2013, Denver, Colorado.
    And in June, we round out the kick-off to summer at OHUG 2013 in Dallas and Kscope13 in New Orleans: OHUG 2013, June 9 - 13, 2013, Dallas, Texas; ODTUG Kscope13, June 23 - 27, 2013, New Orleans, LA.
    The Labs & Demos
    As always, a hallmark of our team is our mobile usability labs. If you haven’t seen them, they are a great way for customers and partners to get a peek at what Oracle is working on next, and a chance for you to provide your candid perspective. Based on the interest and enthusiasm from customers last year at Collaborate, we are adding more demo stations to our user group presence in the year ahead. If you want to see some of the work we are doing first-hand but don’t have a lot of time, the demo stations are a great way to get a quick update on the latest wow-factor we are researching. I can promise that you will see whatever we think is new and interesting at the demo stations first. [Photo: Oracle OpenWorld 2012 Apps UX demo station]
    For Applications Developers
    More and more, I get asked the question, “How do I build an application that looks like Fusion?” My answer is Fusion Applications Design Patterns. You can find out more about how Fusion Applications developers can leverage ADF and the user experience best practices we developed for Fusion at sessions led by Ultan O’Broin, Director of Global User Experience, in the year ahead. [Photo: Ultan O’Broin, on Fusion design patterns] Building mobile applications is also top of mind these days. If you want to understand how Oracle is approaching this strategy, check out our session on mobile user experience design patterns with Mobile ADF. In many cases, this will be presented by Lynn Rampoldi-Hnilo, Senior Manager of Mobile User Experiences, and in a few cases our ever-ready traveler Ultan O’Broin will be on deck. [Photo: Lynn Rampoldi-Hnilo, on mobile user experience design patterns]
    Applications User Experiences
    Fusion Applications continues to evolve, and you will see the new face of Fusion Applications at our executive sessions in the year ahead, which are led by vice president Jeremy Ashley or a hand-picked presenter, such as one of our Fusion User Experience Advocates. [Photo: Edward Roske, CEO, InterRel Consulting, and Fusion User Experience Advocate] As always, our strategy is to take our lessons learned and spread them across the Applications product lines. A great example is the enhancements coming in the PeopleSoft user experience, which you can hear about from Harris Kravatz, Senior Manager, PeopleSoft User Experience.
    Fusion Applications Extensibility
    We can’t talk about Fusion Applications without talking about how to make it look like your business. If tailoring Fusion Applications is a question in your mind, and it should be, you should hit one of these sessions. These sessions will be led by our own Killian Evers, Senior Director, Tim Dubois, User Experience Architect, and some well-trained Fusion User Experience Advocates.
    Find out more
    If you want to stay on top of where and when we will be, you can always sign up for our newsletter or check out the events page of usableapps.

    Read the article

  • Create Shortcuts for Your Favorite or Most Used Folders in Ubuntu

    - by Asian Angel
    Do you have certain folders that you access often each day but are only available through the Places Menu or Nautilus? See how easy it is to create shortcuts for your desktop and taskbar with our quick tutorial. To get started open Nautilus and locate the folders that you want to make new shortcuts for. For our example we chose Ubuntu One. Right click on the chosen folder and select Make Link. Your new shortcut will appear with the text Link to “Folder Name” and an Arrow Shortcut Marker attached. If you are happy with your new shortcut as is, then drag it to your desktop or taskbar as desired. We created the shortcut twice in our example…once for the desktop and once for the taskbar. For our example we decided to customize the taskbar shortcut a bit. To customize your shortcut right click on the shortcut and select Properties. Note: The desktop shortcut is limited on the amount you can customize it (name change and addition of up to four emblems to the folder). From here you can rename the shortcut and change the icon as desired. A quick name change and new icon made a huge improvement in how our taskbar shortcut looked. Note: The link for the icon we used is shown below. A little touch-up to our desktop shortcut and both are looking good. Download the Ubuntu Cloud Icon *Icon is 128*128 pixels and comes in .png format.

    Read the article

  • Live Event: OTN Architect Day: Cloud Computing - Two weeks and counting

    - by Bob Rhubart
    In just two weeks architects and others will gather at the Oracle Conference Center in Redwood Shores, CA for the first Oracle Technology Network Architect Day event of 2013. This event focuses on Cloud Computing and features sessions built around real-world examples of cloud computing implementations.
    When: Tuesday, July 9, 2013, 8:30am - 12:30pm
    Where: Oracle Conference Center, 350 Oracle Pkwy, Redwood City, CA 94065
    Register now. It's free! Here's the agenda:
    8:30am - 9:00am: Registration and Continental Breakfast
    9:00am - 9:45am: Keynote: 21st Century IT | Dr. James Baty, VP, Global Enterprise Architecture Program, Oracle. Imagine a time long, long ago. A time when servers were certified and dedicated to specific applications, when anything posted on an enterprise web site was from restricted, approved channels, and when we tried to limit the growth of 'dirty' data and storage. Today, applications are services running in the multi-tenant hybrid cloud. Companies beg their customers to tweet them, friend them, and publicly rate their products. And constantly analyzing a deluge of Internet, social and sensor data is the key to creating the next super-successful product, or capturing an evil terrorist. The old IT architecture was planned, dedicated, stable, controlled, with separate and well-defined roles. The new architecture is shared, dynamic, continuous, XaaS, DevOps. This keynote session describes the challenges and opportunities that the new business / IT paradigms present to the IT architecture and architects.
    9:45am - 10:30am: Technical Session: Oracle Cloud: A Case Study in Building a Cloud | Anbu Krishnaswami, Enterprise Architect, Oracle. Building a cloud can be challenging thanks to the complex requirements unique to cloud computing and the massive scale typically associated with it. Cloud providers can take an Infrastructure as a Service (IaaS) approach and build a cloud on virtualized commodity hardware, or they can take the Platform as a Service (PaaS) path, a service-oriented approach based on pre-configured, integrated, engineered systems. This presentation uses the Oracle Cloud itself as a case study in the use of engineered systems, demonstrating how the technical design of engineered systems is leveraged for building PaaS and SaaS cloud services and a cloud management infrastructure. The presentation will also explore the principles, patterns, best practices, and architecture views provided in Oracle's Cloud reference architecture.
    10:30am - 10:45am: Break
    10:45am - 11:30am: Technical Session: Database as a Service | Michael Timpanaro-Perrotta, Director, Product Management, Oracle Database Cloud. New applications are now commonly built in a cloud model, where the database is consumed as a service, and many established business processes are beginning to migrate to database as a service (DBaaS). This adoption of DBaaS is made possible by the availability of new capabilities in the database that enable resource pooling, dynamic resource management, model-based provisioning, metered use, and effective quality-of-service controls. This session will examine the catalog of database services at a large commercial bank to understand how these capabilities are enabling DBaaS for a wide range of needs within the enterprise.
    11:30am - 12:00pm: Panel Q&A. Dr. James Baty, Anbu Krishnaswami, and Michael Timpanaro-Perrotta respond to audience questions.
    Registration is free, but seating is limited, so register now.

    Read the article

  • Mixed Solaris 10 and 11 versions in logical domains on the same server

    - by jsavit
    One question that comes up frequently is whether you can mix Solaris 10 and Solaris 11 in different logical domains under Oracle VM Server for SPARC. The answer is yes, depending only on the system software requirements for the underlying hardware platform. Different versions of Solaris 10 and 11 can exist side by side on the same server and can act as control, service, I/O, or guest domains, subject only to the minimum software levels documented in the System Requirements section of the Oracle VM Server for SPARC Release Notes. Here's an example taken from a running system. First, the control domain, which is running Solaris 10; one of its guests, atl-sewr-pool-152, is running Solaris 11 (shown logging in below).
        # uname -a
        SunOS atl-sewr-24 5.10 Generic_147440-01 sun4v sparc SUNW,SPARC-Enterprise-T5220
        # ldm -V
        Logical Domains Manager (v 2.1)
        Hypervisor control protocol v 1.7
        Using Hypervisor MD v 1.3
        System PROM:
        Hypervisor v. 1.10.0 @(#)Hypervisor 1.10.0 2011/04/27 16:19
        # ldm list
        NAME               STATE   FLAGS   CONS  VCPU  MEMORY  UTIL  UPTIME
        primary            active  -n-cv-  SP    16    4G      1.6%  120d 17h
        atl-sewr-pool-148  active  -n----  5001  8     2G      0.1%  119d 21h
        atl-sewr-pool-152  active  -n----  5000  8     4G      0.2%  112d 19h
        atl-sewr-pool-154  active  -n----  5002  8     2G      0.1%  120d 15h
        atl-sewr-pool-155  active  -n----  5003  16    2G      0.0%  26d 14h 30m
    This system is running Oracle VM Server 2.1 with a Solaris 10 control domain. Hmm, I should update this machine to 2.2 when I get a few free moments; upgrading is very straightforward. Here's a login to the highlighted guest:
        Last login: Mon May 21 10:18:16 2012 from dhcp-adc-twvpn-
        Oracle Corporation SunOS 5.11 11.0 November 2011
        sewr@atl-sewr-pool-152:~$ uname -a
        SunOS atl-sewr-pool-152 5.11 11.0 sun4v sparc SUNW,SPARC-Enterprise-T5220
        sewr@atl-sewr-pool-152:~$ cat /etc/release
        Oracle Solaris 11 11/11 SPARC
        Copyright (c) 1983, 2011, Oracle and/or its affiliates. All rights reserved.
        Assembled 18 October 2011
        sewr@atl-sewr-pool-152:~$ sudo virtinfo -ct
        Password:
        Domain role: LDoms guest
        Control domain: atl-sewr-24
        sewr@atl-sewr-pool-152:~$
    That guest is running the GA version of Solaris 11, so I probably should update that some time too. Note the use of the virtinfo -ct command, which lets the guest get information about its hosting environment. Summary: you can mix and match versions of Solaris in logical domains. All the combinations work: Solaris 10 and/or Solaris 11 control and service domains with Solaris 10 and/or Solaris 11 guests. Mixing different guest OS levels on the same server has been one of the traditional reasons for using virtual machines ever since they were invented some 40 years ago: they let you run production and test systems in parallel while upgrading OS levels. This can easily be done with Oracle VM Server for SPARC (Logical Domains).
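
    As a quick aside (not from the original post), here is a minimal Python sketch that wraps the same two guest-side commands shown above, uname -r and virtinfo -ct, to report which Solaris release a domain is running and which control domain hosts it. It assumes Python 3.7+ is available in the guest and that virtinfo is on the PATH; as the sudo prompt above suggests, virtinfo may require elevated privileges.
        import subprocess

        def run(cmd):
            """Run a command and return its stdout as text (raises if the command fails)."""
            return subprocess.run(cmd, check=True, capture_output=True, text=True).stdout

        def describe_domain():
            release = run(["uname", "-r"]).strip()        # e.g. '5.10' or '5.11'
            solaris = "Solaris 11" if release == "5.11" else "Solaris 10"
            print(f"This domain runs {solaris} (kernel {release})")
            # Same command used in the post; prints lines such as
            # 'Domain role: LDoms guest' and 'Control domain: atl-sewr-24'.
            for line in run(["virtinfo", "-ct"]).splitlines():
                print(line)

        if __name__ == "__main__":
            describe_domain()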

    Read the article

  • Characteristics of a Web service that promote reusability and change

    Characteristics of a Web service that promote reusability and change:
    - Standardized data exchange formats (XML, JSON)
    - Standardized communication protocols (SOAP, REST)
    - Promotion of loosely coupled systems
    Standardized data exchange formats (XML, JSON). W3.org defines Extensible Markup Language (XML) as a simple text format derived from SGML. XML was designed to solve challenges found in large-scale electronic publishing, and it also plays an important role in data exchange on the web. JavaScript Object Notation (JSON) is a human-readable, text-based standard designed for data interchange. It is used for serializing and transmitting data over a network connection in a structured format; its primary use is to transmit data between a server and a web application. JSON is an alternative to XML.
    Standardized communication protocols (SOAP, REST). W3Schools.com defines SOAP as a simple XML-based protocol that lets applications exchange data over HTTP. SOAP provides a way to communicate between applications running on different operating systems, with different technologies and programming languages. In 2007, Stefan Tilkov described Representational State Transfer (REST) as a set of principles that outlines how Web standards are supposed to be used; using REST in an application ensures that it exploits the Web's architecture to its benefit.
    Promotes loosely coupled systems. "Loose coupling is an approach to interconnecting the components in a system or network so that those components, also called elements, depend on each other to the least extent practicable. Coupling refers to the degree of direct knowledge that one element has of another." (TechTarget.com, 2011) "A loosely coupled system can be easily broken down into definable elements. The extent of coupling in a system can be measured by mapping the maximum number of element changes that can occur without adverse effects. Examples of such changes include adding elements, removing elements, renaming elements, reconfiguring elements, modifying internal element characteristics and rearranging the way in which elements are interconnected." (TechTarget.com, 2011)
    References:
    W3C. (2011). Extensible Markup Language (XML). Retrieved from W3.org: http://www.w3.org/XML/
    W3Schools.com. (2011). SOAP Introduction. Retrieved from W3Schools.com: http://www.w3schools.com/soap/soap_intro.asp
    Tilkov, Stefan. (2007). A Brief Introduction to REST. Retrieved from Infoq.com: http://www.infoq.com/articles/rest-introduction
    TechTarget.com. (2011). Loose coupling. Retrieved from TechTarget.com: http://searchnetworking.techtarget.com/definition/loose-coupling
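
    To make the data-format point concrete, here is a minimal Python sketch (not from the cited sources; the order record and its fields are invented for illustration) that serializes the same data once as JSON and once as XML, the two interchange formats discussed above. Either payload could then be sent to a SOAP endpoint or a REST resource over HTTP.
        import json
        import xml.etree.ElementTree as ET

        # A hypothetical record a web service might exchange with a client.
        order = {"id": 42, "customer": "Acme", "total": 19.95}

        # JSON: maps directly onto objects/dictionaries; common for server-to-browser exchange.
        json_payload = json.dumps(order)
        assert json.loads(json_payload) == order

        # XML: the same data expressed as a small element tree.
        root = ET.Element("order", attrib={"id": str(order["id"])})
        ET.SubElement(root, "customer").text = order["customer"]
        ET.SubElement(root, "total").text = str(order["total"])
        xml_payload = ET.tostring(root, encoding="unicode")

        print(json_payload)  # {"id": 42, "customer": "Acme", "total": 19.95}
        print(xml_payload)   # <order id="42"><customer>Acme</customer><total>19.95</total></order>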

    Read the article

  • Windows Azure Recipe: Social Web / Big Media

    - by Clint Edmonson
    With the rise of social media there has been an explosion of special-interest media web sites on the web. From athletics to board games to funny animal behaviors, you can bet there's a group of people somewhere on the web talking about it. Social media sites allow us to interact, share experiences, and bond with like-minded enthusiasts around the globe. And through the power of software, we can follow trends in these unique domains in real time.
    Drivers: Reach, Scalability, Media hosting, Global distribution
    Solution: Here's a sketch of how a social media application might be built out on Windows Azure.
    Ingredients:
    - Traffic Manager (optional) – can be used to provide hosting and load balancing across different instances and/or data centers. Perfect if the solution needs to be delivered to different cultures or regions around the world.
    - Access Control – this service is essential to managing user identity. It's backed by a full-blown implementation of Active Directory and allows the definition and management of users, groups, and roles. A pre-built ASP.NET membership provider is included in the training kit to leverage this capability, but it's also flexible enough to be combined with external identity providers including Windows LiveID, Google, Yahoo!, and Facebook. The provider model has extensibility points to hook into other identity providers as well.
    - Web Role – hosts the core of the web application and presents a central social hub to users.
    - Database – used to store core operational, functional, and workflow data for the solution's web services.
    - Caching (optional) – as a web site's traffic grows, caching can be leveraged to keep frequently used read-only, user-specific, and application resource data in a high-speed distributed in-memory cache for faster response times and ultimately higher scalability without spinning up more web and worker roles. It includes a token-based security model that works alongside the Access Control service.
    - Tables (optional) – for semi-structured data streams that don't need relational integrity, such as conversations, comments, or activity streams, tables provide a faster and more flexible way to store this kind of historical data.
    - Blobs (optional) – users may be creating or uploading large volumes of heterogeneous data such as documents or rich media. Blob storage provides a scalable, resilient way to store terabytes of user data. The storage facilities can also integrate with the Access Control service to ensure users' data is delivered securely.
    - Content Delivery Network (CDN) (optional) – for sites that serve users around the globe, the CDN is an extension to blob storage that, when enabled, will automatically cache frequently accessed blobs and static site content at edge data centers around the world. The data can be delivered statically or streamed in the case of rich media content.
    Training: These links point to online Windows Azure training labs and resources where you can learn more about the individual ingredients described above. (Note: the entire Windows Azure Training Kit can also be downloaded for offline use.)
    - Windows Azure (16 labs) – Windows Azure is an internet-scale cloud computing and services platform hosted in Microsoft data centers, which provides an operating system and a set of developer services that can be used individually or together. It gives developers the choice to build web applications; applications running on connected devices, PCs, or servers; or hybrid solutions offering the best of both worlds. New or enhanced applications can be built using existing skills with the Visual Studio development environment and the .NET Framework. With its standards-based and interoperable approach, the services platform supports multiple internet protocols, including HTTP, REST, SOAP, and plain XML.
    - SQL Azure (7 labs) – Microsoft SQL Azure delivers on the Microsoft Data Platform vision of extending the SQL Server capabilities to the cloud as web-based services, enabling you to store structured, semi-structured, and unstructured data.
    - Windows Azure Services (9 labs) – As applications collaborate across organizational boundaries, ensuring secure transactions across disparate security domains is crucial but difficult to implement. Windows Azure Services provides hosted authentication and access control using powerful, secure, standards-based infrastructure.
    See my Windows Azure Resource Guide for more guidance on how to get started, including links to web portals, training kits, samples, and blogs related to Windows Azure.
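
    As an illustration of the Blobs ingredient, here is a minimal Python sketch using today's azure-storage-blob package (which postdates the Windows Azure branding in this post); the connection string, container name, and helper function are hypothetical and not part of the original recipe.
        from azure.storage.blob import BlobServiceClient  # pip install azure-storage-blob

        # Hypothetical connection string; in practice read it from configuration, not source code.
        CONN_STR = "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>;EndpointSuffix=core.windows.net"

        def upload_user_media(local_path: str, blob_name: str) -> str:
            """Upload a user's media file to the 'user-media' container and return its URL."""
            service = BlobServiceClient.from_connection_string(CONN_STR)
            container = service.get_container_client("user-media")  # assumed container name
            with open(local_path, "rb") as data:
                container.upload_blob(name=blob_name, data=data, overwrite=True)
            return f"{container.url}/{blob_name}"

        # Example usage: url = upload_user_media("cat-video.mp4", "uploads/cat-video.mp4")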

    Read the article

  • Unit testing in Django

    - by acjohnson55
    I'm really struggling to write effective unit tests for a large Django project. I have reasonably good test coverage, but I've come to realize that the tests I've been writing are definitely integration/acceptance tests, not unit tests at all, and critical portions of my application are not being tested effectively. I want to fix this ASAP. Here's my problem: my schema is deeply relational and heavily time-oriented, giving my model objects high internal coupling and lots of state. Many of my model methods query based on time intervals, and I've got a lot of auto_now_add going on in timestamped fields. So take a method that looks like this, for example:
        def summary(self, startTime=None, endTime=None):
            # ... logic to assign a proper start and end time
            # if none was provided, probably using datetime.now()
            objects = self.related_model_set.manager_method.filter(...)
            return sum(object.key_method(startTime, endTime) for object in objects)
    How does one approach testing something like this? Here's where I am so far. It occurs to me that the unit-testing objective should be: given some mocked behavior from key_method on its arguments, does summary correctly filter and aggregate to produce a correct result? Mocking datetime.now() is straightforward enough, but how can I mock out the rest of the behavior? I could use fixtures, but I've heard pros and cons of using fixtures for building my data (poor maintainability being a con that hits home for me). I could also set up my data through the ORM, but that can be limiting, because then I have to create related objects as well, and the ORM doesn't let you mess with auto_now_add fields manually. Mocking the ORM is another option, but not only is it tricky to mock deeply nested ORM methods, the logic in the ORM code gets mocked out of the test, and mocking seems to make the test really dependent on the internals and dependencies of the function under test. The toughest nuts to crack seem to be functions like this one, which sit on a few layers of models and lower-level functions and are very dependent on the time, even though they may not be super complicated. My overall problem is that no matter how I slice it, my tests end up looking far more complex than the functions they are testing.
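
    One common pattern for exactly this situation, sketched below under invented names (a Report model with a ForeignKey from RelatedModel, a created auto_now_add field, and instances exposing key_method), is to patch datetime where the model module looks it up and to stub key_method on the related class, so the test exercises only summary()'s own filtering and aggregation logic. This is an illustration of the mocking technique, not code from the project in question.
        from datetime import datetime
        from decimal import Decimal
        from unittest import mock

        from django.test import TestCase

        from myapp.models import Report, RelatedModel  # hypothetical models for illustration


        class SummaryTests(TestCase):
            def test_summary_aggregates_key_method_over_related_objects(self):
                report = Report.objects.create(name="r1")
                related = [RelatedModel.objects.create(report=report) for _ in range(3)]

                # auto_now_add fields can't be set through create(); a queryset update()
                # bypasses them when a test needs a specific timestamp.
                fixed_now = datetime(2012, 6, 1, 12, 0, 0)
                RelatedModel.objects.filter(pk__in=[r.pk for r in related]).update(created=fixed_now)

                # Patch datetime where summary() looks it up (assumes models.py does
                # `from datetime import datetime`) and stub key_method on the class so
                # every instance returns a known value.
                with mock.patch("myapp.models.datetime") as mocked_dt, \
                        mock.patch.object(RelatedModel, "key_method", return_value=Decimal("10")) as key_method:
                    mocked_dt.now.return_value = fixed_now
                    result = report.summary()

                self.assertEqual(result, Decimal("30"))
                self.assertEqual(key_method.call_count, 3)
    The point of the pattern is that the test no longer cares how key_method computes its value or how timestamps were assigned; it only verifies that summary() selects the right related objects and sums their results for a defaulted interval.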

    Read the article

  • Stuck with Documentum Still? Do MORE with Oracle WebCenter!

    - by Michael Snow
    WEBCAST TODAY!! 03/22/12. Do you need to lower costs? Raise productivity? Foster innovation? Improve online engagement? But you're still stuck with Documentum? Step away from the ledge – there is hope – let us help you.
    Top 4 Content Imperatives:
    - Lower costs – reduce labor, maintenance fees, storage and electrical consumption
    - Raise productivity – automation and integration, communication, findability
    - Foster innovation – enable collaboration, expertise location
    - Improve online engagement – enable user-driven, dynamic marketing initiatives
    With the coming technology wave we see four content imperatives. Every organization has had to reduce costs; cost cutting has become a way of life. Everyone is working three jobs as positions are eliminated. And so we have to reduce labor, reduce maintenance, and reduce the money we waste on things like storing content that is redundant or no longer useful. To fill that gap, we also need to raise productivity. Knowledge workers represent the fastest-growing segment of the workforce, accounting for 40%-75% of the employees at organizations in sectors like financial services, life sciences, healthcare and retail. What's more, their wages total 18 percent of United States GDP. So we can't afford information systems that don't let our top performers be the best they can be. We look to automate content processes, provide ways to integrate that content into our processes, provide communication to support decisions, and make content more findable so people can make the right decision and move the process forward. And to get ourselves out of the current financial situation, we can only cut costs so far. We have to innovate our way out of economic tough times – to find new products and new markets. To enable the innovation process, we have to enable collaboration and expertise location. So much of innovation is about building on innovations that have come before. To solve problems, we have to be able to find what our organization has already created. We often find that the problems we need to solve have already been solved, if we can find the right document, the right person. So we have to provide systems that enable us to stand on the shoulders of our organization's accomplishments. Good content drives great marketing. Online engagement is an absolute necessity for modern, growing marketing organizations, which require that business users be enabled for dynamic marketing content creation, updates and targeted content creation and management. Unfortunately, if you are currently stuck with Documentum, you are really lacking in Web Experience Management capabilities. Documentum previously used FatWire for web publishing; FatWire is now part of Oracle.
    Oracle provides powerful web engagement capabilities:
    - Increase sales and loyalty by optimizing online engagement
    - Create, manage and moderate contextually relevant, targeted and interactive online experiences
    - Optimize customer engagement across web, mobile and social channels
    - Manage a large-scale, multichannel global online presence with integration to enterprise applications
    - Enable business users to control their content and make their own updates
    - Publish content from native files – enable navigation of project documents, procedures, policy information
    - Enable content display and updates from existing web applications – one click to add drag-and-drop content management functionality
    So you get the ability to self-publish information and make it navigable, to move the publishing process from IT to business users, and to address a whole new area of user engagement with web experience management. So… if you are still stuck with Documentum and don't know what to do – contact us. Not only will Oracle help you step away from the ledge, but with the MoveOff Documentum program we are offering you a way out: trade in your Documentum licenses for a 100% credit on Oracle WebCenter. How's that for a nice bonus? It's time to stop maintaining Documentum and to start innovating with Oracle WebCenter. Learn More Here! To learn more about what Oracle WebCenter can offer you today, join us for a webcast – your eyes will be opened to all that's possible. Do More with WebCenter: Extend Beyond Content Management

    Read the article

  • Join our Marketing Intelligence Team in Dublin!

    - by jessica.ebbelaar
    Do you want to work with the brightest minds in the industry? Want to be part of a global team that's changing the way the world does business? Then Oracle is the place for YOU. Join now as a Marketing Intelligence Representative. You will have the opportunity to develop within the role by working alongside the Business Development, Sales and Marketing teams within Oracle. The Marketing Intelligence Group is viewed as a true talent pool for the Business Development and Sales teams. Oracle offers a structured training programme for Marketing Intelligence Representatives and Business Development Consultants, including our approved sales-certified training methodology along with regular product training. Miriam started her career as a Marketing Intelligence Representative six years ago, and shares what she has learned and how her career is progressing.
    My Career Path at Oracle:
    - June 2005 – October 2005: Profiler in the Marketing Intelligence Team
    - November 2005 – October 2006: Team Leader for MIT
    - November 2006 – February 2008: Business Development Consultant, Iberia
    - March 2008 – December 2010: Lead Management Specialist
    - Currently: Sales Program Manager for Iberia & Benelux
    What did you learn from your role in the Marketing Intelligence Team? Being a Profiler helped me to understand how an organisation works, from beginning to end. It is like being at university, but being paid! The three key things I learnt in this role are: Knowledge of customers: you are on the phone with over 70 customers daily. Not only does this give you an overview of the IT infrastructure of the customers' companies, but also how to manage their questions and rejections. Essentially you are learning how to convert their pain and complaints into business opportunities. Knowledge of Oracle: as a Profiler you get an excellent overview of how Oracle works internally, from Marketing to Sales, without forgetting the Operations Team. Knowledge about yourself: as a Profiler I learnt how to work outside of my comfort zone; there is a new challenge almost every day, but Oracle is there to support you every step of the way. Oracle really invests in developing the MIT team, and as a Profiler you can expect product and sales training on a monthly basis.
    How did you progress from MIT to the Business Development Group (BDG)? I made sure that my manager knew from the very beginning that I was keen to progress at Oracle, and I was set very clear objectives to help me reach my goal. My manager was very supportive and ensured I received all the training I needed. After I became a Team Leader of Profiling, I moved to an Iberia BDG position.
    How do you feel your experience in MI has helped you in your current role? I truly believe that the MI position gives you a great overview of Oracle, and this has really helped me in my current position. I am the Sales Program Manager for Iberia & Benelux, and in my campaigns I need to target the right companies and the right job specs. My time in the Marketing Intelligence team really helped me to understand how to focus and target my campaigns, so I know I don't miss any business opportunities!
    How would you sum up your Oracle experience? Oracle is a big organisation with big opportunities. With the right skills and the great training programs that Oracle offers, the only limit is you! If you have any questions related to this article, feel free to contact [email protected]. You can find all our job opportunities via http://campus.oracle.com.
    Technorati Tags: Marketing Intelligence, Benelux, Iberia, Profiler, Business Development, Sales Representatives, BDG, Business Development Group, opportunities, Oracle

    Read the article
