Search Results

Search found 15798 results on 632 pages for 'authentication required'.


  • WebCenter Customer Spotlight: Regency Centers Corporation

    - by me
    Author: Peter Reiser - Social Business Evangelist, Oracle WebCenter

    Solution Summary
    Regency Centers Corporation, based in Jacksonville, FL, is a leading national owner, operator, and developer of grocery-anchored and community shopping centers. Regency grew rapidly over much of the last decade. To keep up with the monthly and yearly administrative processes required to manage thousands of tenants, including reconciling yearly pass-through expenses, the customer upgraded to Oracle's JD Edwards EnterpriseOne Version 9.0 and deployed Oracle WebCenter Imaging, Process Management, and Oracle BI Publisher to streamline invoice processing and reporting. Using Oracle WebCenter Imaging, Regency accelerated invoice processing and improved vendor invoice accuracy, which increases process integrity by identifying potential duplicate bills while enabling rapid approval of electronic invoice documents.

    Company Overview
    Regency Centers Corporation, based in Jacksonville, FL, is a leading national owner, operator, and developer of grocery-anchored and community shopping centers. The company owns 367 centers, totaling nearly 50 million square feet, located in top markets throughout the United States. Founded in 1963 and operating as a fully integrated real estate company, Regency is a qualified real estate investment trust that is self-administered and self-managed, operating from 17 regional offices around the country.

    Business Challenges
    - Ensure continued support of vital business applications that drive the real estate developer's key business processes, including property management and tenant payment processing
    - Streamline year-end expense recognition and calculation, enabling faster tenant billing
    - Move to a Web-based platform to deliver greater mobility and convenience to employees
    - Minimize system customizations to reduce IT management costs and burden moving forward

    Solution Deployed
    Regency Centers Corporation worked with the Oracle Partner ICS to upgrade to Oracle's JD Edwards EnterpriseOne Version 9.0, migrating to a more user-friendly, Web-based platform and realizing numerous new efficiencies in property management and tenant payment processing. They accelerated and improved vendor invoice accuracy with Oracle WebCenter Imaging, which increases process integrity by identifying potential duplicate bills while enabling rapid approval of electronic invoice documents.

    Business Results
    - Enabled faster and more accurate tenant billing for year-end expenses, accelerating collections of millions of dollars in revenue
    - Gained full audit and drill-down capabilities that facilitate understanding various aspects of calculations for expense participation generation
    - Increased process integrity by identifying potential duplicate bills while enabling rapid approval of electronic invoice documents
    - Helped to ensure on-time payments to hundreds of vendors, including contractors and utilities

    "We have realized numerous efficiencies with Oracle's JD Edwards EnterpriseOne 9.0, particularly around tenant billings. It accelerates our year-end expense reconciliation process and enables us to create and process billings more quickly." - James Chiang, Vice President of Real Estate Accounting, Regency Centers Corporation

    Additional Information
    - Regency Centers Corporation Customer Snapshot
    - Oracle WebCenter Imaging
    - JD Edwards EnterpriseOne Financials 9.0
    - JD Edwards EnterpriseOne Project Costing
    - JD Edwards EnterpriseOne Real Estate Management
    - Oracle Business Intelligence Publisher
    - Oracle Essbase

    Read the article

  • Oracle HCM Cloud Customer Q&A with WAXIE Sanitary Supply

    - by HCM-Oracle
    At this year's Oracle HCM User Group (OHUG) Global conference, we had the opportunity to sit down with Oracle HCM Cloud customers for a short Q&A. We got to hear about what brought them to the OHUG conference, some of the benefits they are receiving from their Oracle HCM Cloud solutions, and advice they would give other businesses looking to move to the cloud. Below is a discussion we had with Melissa Halverson, Benefits & HRIS Manager at WAXIE Sanitary Supply.

    Q: What made you attend the OHUG Global Conference this year?
    Halverson: The biggest reason is networking. It allows me to connect with others in the Oracle HCM Cloud community. I was able to speak at the HCM Cloud SIG (Special Interest Group) on the first day and share my experiences as well as hear the experiences of other Oracle HCM Cloud users. It also allows me to get face-time with key people within Oracle.

    Q: What Oracle HCM solutions are you currently using?
    Halverson: Global HR, Benefits, Workforce Compensation, and Performance Management.

    Q: Do you plan to invest further in Oracle HCM?
    Halverson: Yes, we are interested in Time and Labor. We would also like to get Recruiting at some point in the future.

    Q: What would you say is the most significant benefit you've realized from your use of Oracle HCM solutions?
    Halverson: First and foremost would be process improvement. Before we had Oracle HCM Cloud, we relied on a paper process where something as simple as an employee address change required changes to be made manually in 9 different systems. Obviously that was extremely inefficient, but it also increased the likelihood of errors being made. The other huge benefit we have seen was in making information visible to the people that need it. Prior to implementing Oracle HCM Cloud, it was very difficult for anyone to access and make use of the information in our systems. Now, we can provide this information to those who need it to make better decisions.

    Q: What advice would you give an organization looking to move their HR systems to the cloud?
    Halverson: One thing I think many organizations don't spend enough time doing is thoroughly vetting their implementation partner. I believe you should be vetting your implementation partner as much as you did the system itself. Also, manpower is so important. Involve as large a team as possible, because you don't want to get stuck having too few bodies to help out. And set realistic time frames: biting off more than you can chew will inevitably result in failure. Having a phased approach is always best, rather than trying to do everything at once.

    Thanks for the tips, Melissa. Enjoy the rest of the conference!

    Read the article

  • Live Virtual Class for Partners: Application Management

    - by Patrick Rood
    November 11-12th: Manageability Partner Community - Application Management Suite Live Virtual Training

    This training will be offered to Oracle Partners over a live webcast during business hours. Each day will consist of approximately 2-3 hours of lecture and demos. It will be recorded and available for playback.

    Purpose: This virtual course is a comprehensive program of training sessions, prepared and presented by Product Managers. This ensures you have all the information you need to position and sell Oracle Application Management Suites. The sessions will be lecture based, with demonstrations to complement them. These sessions are interactive and everyone will be required to participate. Customer case studies will be used as appropriate, and there will be plenty of opportunity for in-depth discussion. Please bring to the training an understanding of what Enterprise Manager 12c does for our customers, along with your own experiences to date.

    Logistics:
    Topic: Oracle Application Management Suite Training (2 days, approx. 2-3 hours per day)
    WebEx session details to be provided upon registration.
    Monday 11th November | 14:00 GMT | 18:00 Gulf (GMT+4)
    Tuesday 12th November | 14:00 GMT | 18:00 Gulf (GMT+4)

    Read the article

  • How to Enable JavaScript file API in IE8 [closed]

    - by saeed
    I have developed a web application in ASP.NET. On one page of this project, the user chooses a file in a picture format (jpeg, jpg, bmp, ...) and I want to preview the image in the page, but I don't want to post the file to the server - I want to handle it on the client. I have done it with JavaScript functions via the File API, but it only works in IE9, and most of my customers use IE8. The reason is that IE8 doesn't support the File API.

    Is there any way to make IE8 upgrade, or some patch in code-behind? I mean: check if the browser is IE and doesn't support the File API, then call a function which upgrades IE8 to IE9 automatically. I don't want to ask the user to do it in a message - I want to do it programmatically, or even install a special patch that is required for the File API, because customers thought it was a bug in my application and their computer knowledge is low. What am I supposed to do with this?

    I also used the AsyncFileUpload AJAX control, but it posts the file to the server anyway, with an AJAX solution and an HTTP handler, whereas JavaScript does it all in the client browser.

    The following script checks whether the browser supports the File API:

    <script>
    if (window.File && window.FileReader && window.FileList && window.Blob)
        document.write("<b>File API supported.</b>");
    else
        document.write('<i>File API not supported by this browser.</i>');
    </script>

    The following scripts do the read and load the image:

    function readfile(e1) {
        var filename = e1.target.files[0];
        var fr = new FileReader();
        fr.onload = readerHandler;
        fr.readAsText(filename);
    }

    HTML code:

    <input type="file" id="getimage">
    <fieldset><legend>Your image here</legend>
        <div id="imgstore"></div>
    </fieldset>

    JavaScript code:

    <script>
    function imageHandler(e2) {
        var store = document.getElementById('imgstore');
        store.innerHTML = '<img src="' + e2.target.result + '">';
    }

    function loadimage(e1) {
        var filename = e1.target.files[0];
        var fr = new FileReader();
        fr.onload = imageHandler;
        fr.readAsDataURL(filename);
    }

    window.onload = function() {
        var x = document.getElementById("filebrowsed");
        x.addEventListener('change', readfile, false);
        var y = document.getElementById("getimage");
        y.addEventListener('change', loadimage, false);
    }
    </script>

    Read the article

  • Sometimes keyboard & touchpad work... sometimes not

    - by Voyagerfan5761
    When I first ran Ubuntu from CD on this Dell Inspiron 2650, it worked for about ten to fifteen minutes, then it hung (I was probably trying to do too much at once from a Live CD). The next time, my mouse and keyboard didn't work. I rebooted three times and finally got them working. I then installed Ubuntu alongside Windows XP. After installing, selecting the OS in GRUB worked, but my touchpad and keyboard were again not working. I rebooted, and they worked. (I fortunately had a USB mouse with which to reboot.)

    Booting Ubuntu and then rebooting to enable my keyboard and touchpad has become a routine ever since. Often several reboots are required; at one point I had to reboot over a dozen times in a row before getting a session where everything worked properly. (My installation has been in place for about a week now.) I've looked around for a device manager equivalent, to no avail. Sometimes the hardware is properly detected, and sometimes it's not. Once or twice I've had the keyboard detected properly but the touchpad not. Plugging in my wireless card also sometimes requires a plug, unplug, and plug again to get it working.

    So is there some solution? I'm without an Internet connection at home, and this "laptop" is really a wall wart on my desk, so suggestions for packages may take a while to test.

    Xorg logs: I captured four sample Xorg logs: one from a startup where the devices worked; one from when they didn't; one from a session where Ubuntu thought my touchpad was a normal mouse; and one from a session where my keyboard worked but the touchpad didn't. See this gist.

    Updated 2010-12-15 01:50 UTC with Xorg.0.log.keyboardonly file illustrating the case where the keyboard worked but not the touchpad.
    Updated 2011-01-11 04:10 UTC with Xorg.0.log.touchpadregmouse to illustrate a case where the touchpad was detected as a regular mouse (no "Touchpad" tab in mouse prefs).
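
    Not part of the original question, but for anyone comparing a working session against a broken one, a few standard input-diagnostic commands (assuming the usual X.Org tools are installed) can be run in each state and the output diffed:

    # List the input devices X currently knows about
    xinput list
    # The kernel's view of input hardware (keyboard/touchpad should appear here)
    cat /proc/bus/input/devices
    # Recent kernel messages about input device detection
    dmesg | grep -i -E 'input|keyboard|touchpad'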

    Read the article

  • BizTalk 2010 Certification Exam

    - by Paul Petrov
    I took a shot at the new (to me) certification exam for BizTalk 2010. I was able to pass it without any preparation, just based on experience. That does not mean this exam is a very simple one. Compared to the previous one (2006 R2), it covers some new areas (like WCF) and has some demanding questions and situations to think about. But the most challenging factor is the broad feature coverage. Overall, my impression is that if BizTalk continues to grow in scope, it's better to create separate exams for core functionality and extended features (like EDI, RFID, LOB adapters), because it's really hard to cover the vast array of BizTalk capabilities.

    As far as required knowledge and question allocation go, I think Microsoft's description is on target. There were definitely more questions on deployment, configuration and administration aspects compared to the previous exam. WCF and WCF-based adapters now play a big role, and this topic was covered well too. Extended functionality is claimed at 13% of the exam; I felt there were plenty of RFID questions but not many EDI ones, which is why I thought it'd be useful to split the exam into two to cover all of them equally. BRE is still there, and that's good, because it's usually not a very well-known or loved feature of the package.

    In the end, for those who plan to get certified, my advice would be to know all these areas of BizTalk for guaranteed passing: messaging and orchestrations, core adapters, routing, patterns; development of all artifacts and orchestrations; debugging and exception handling; packaging, deployment, tracking and administration; WCF bindings and adapters; BAM, BRE, RFID, EDI, etc. You may get by not knowing one smaller, non-essential part (like I did with RFID, for example). In such a case, you'd better know all the other areas very well to cover for the weak spot. If there is more than one gap in your knowledge, it's a good idea to study and prepare: MSDN, blogs, virtual labs and a good VM to play with can help when experience is not enough. So best wishes and good luck to you in passing this certification!

    Read the article

  • "Package dependencies cannot be resolved" error when installing software

    - by Savitha
    I am getting a problem while installing media player packages:

    Package dependencies cannot be resolved. This error could be caused by required additional software packages which are missing or not installable. Furthermore there could be a conflict between software packages which are not allowed to be installed at the same time.

      Depends: libc6 (>= 2.7) but 2.13-0ubuntu13 is to be installed
      Depends: libglib2.0-0 (>= 2.24.0) but 2.28.6-0ubuntu1 is to be installed
      Depends: libgstreamer-plugins-base0.10-0 (>= 0.10.22) but 0.10.32-1ubuntu5 is to be installed
      Depends: libgstreamer0.10-0 (>= 0.10.26) but 0.10.32-3ubuntu3 is to be installed
      Depends: liborc-0.4-0 (>= 1:0.4.10) but 1:0.4.11-2 is to be installed
      Depends: libpostproc-extra-51 (>= 4:0.6-1~) but 4:0.6.4-1ubuntu1+medibuntu1 is to be installed
      Depends: libswscale-extra-0 (>= 4:0.6-1~) but 4:0.6.4-1ubuntu1+medibuntu1 is to be installed
    gstreamer0.10-plugins-bad:
      Depends: libc6 (>= 2.7) but 2.13-0ubuntu13 is to be installed
      Depends: libcairo2 (>= 1.2.4) but 1.10.2-2ubuntu2 is to be installed
      Depends: libcdaudio1 (>= 0.99.12p2) but 0.99.12p2-9 is to be installed
      Depends: libdc1394-22 but it is not going to be installed
      Depends: libdirectfb-1.2-9 but it is not going to be installed
      Depends: libflite1 but it is not going to be installed
      Depends: libgcc1 (>= 1:4.1.1) but 1:4.5.2-8ubuntu4 is to be installed
      Depends: libglib2.0-0 (>= 2.26.0) but 2.28.6-0ubuntu1 is to be installed
      Depends: libgsm1 (>= 1.0.13) but it is not going to be installed
      Depends: libgstreamer-plugins-base0.10-0 (>= 0.10.32) but 0.10.32-1ubuntu5 is to be installed
      Depends: libgstreamer0.10-0 (>= 0.10.32) but 0.10.32-3ubuntu3 is to be installed
      Depends: libjasper1 (>= 1.900.1) but 1.900.1-7ubuntu2 is to be installed
      Depends: libmodplug1 but it is not going to be installed
      Depends: libmpcdec6 (>= 1:0.1~r435) but it is not going to be installed
      Depends: libmusicbrainz4c2a (>= 2.1.5) but it is not going to be installed
      Depends: libofa0 (>= 0.9.3) but it is not going to be installed
      Depends: liborc-0.4-0 (>= 1:0.4.10) but 1:0.4.11-2 is to be installed
      Depends: libpng12-0 (>= 1.2.13-4) but 1.2.44-1ubuntu3 is to be installed
      Depends: librsvg2-2 (>= 2.26.0) but 2.32.1-0ubuntu3 is to be installed
      Depends: librtmp0 (>= 2.3) but 2.3-2 is to be installed
      Depends: libschroedinger-1.0-0 (>= 1.0.9) but it is not going to be installed
      Depends: libsndfile1 (>= 1.0.20) but 1.0.23-1build1 is to be installed
      Depends: libstdc++6 (>= 4.1.1) but 4.5.2-8ubuntu4 is to be installed
      Depends: libvpx0 (>= 0.9.0) but it is not going to be installed
    gstreamer0.10-plugins-ugly:
      Depends: libc6 (>= 2.7) but 2.13-0ubuntu13 is to be installed
      Depends: libgcc1 (>= 1:4.1.1) but 1:4.5.2-8ubuntu4 is to be installed
      Depends: libglib2.0-0 (>= 2.24.0) but 2.28.6-0ubuntu1 is to be installed
      Depends: libgstreamer-plugins-base0.10-0 (>= 0.10.26) but 0.10.32-1ubuntu5 is to be installed
      Depends: libgstreamer0.10-0 (>= 0.10.26) but 0.10.32-3ubuntu3 is to be installed
      Depends: libid3tag0 (>= 0.15.1b) but it is not going to be installed
      Depends: libmad0 (>= 0.15.1b-3) but it is not going to be installed
      Depends: liborc-0.4-0 (>= 1:0.4.10) but 1:0.4.11-2 is to be installed
      Depends: libstdc++6 (>= 4.1.1) but 4.5.2-8ubuntu4 is to be installed
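
    Not from the original question, but the mix of Ubuntu and medibuntu versions in the output above usually points at repository skew. A common first diagnostic sequence (package names taken from the error, not verified against this system) is:

    # Refresh package lists and try to repair any partial installs
    sudo apt-get update
    sudo apt-get install -f
    # Show which repository each conflicting version comes from
    apt-cache policy gstreamer0.10-plugins-bad libpostproc-extra-51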

    Read the article

  • My Automated NuGet Workflow

    - by Wes McClure
    When we develop libraries (whether internal or public), it helps to have a rapid ability to make changes and test them in a consuming application.

    Building

    Set up the library with automatic versioning and a nuspec:
    - Set the library assembly version to auto-increment build and revision: in AssemblyInfo, use [assembly: AssemblyVersion("1.0.*")]. This auto-increments build and revision based on the time of the build.
    - Major & minor: major should be changed when you have breaking changes; minor should be changed once you have a solid new release. During development I don't increment these.
    - Create a nuspec and version it with the code: in the nuspec, set the version to <version>$version$</version>. This uses the assembly's version, which is auto-incrementing.

    Make changes to code, then run the automated build (ruby/rake): run "rake nuget". The nuget task builds the NuGet package and copies it to a local NuGet feed. I use an environment variable to point at this so I can change it on a machine level! The nuget command below assumes a nuspec is checked in called Library.nuspec next to the csproj file:

    $projectSolution = 'src\\Library.sln'
    $nugetFeedPath = ENV["NuGetDevFeed"]

    msbuild :build => [:clean] do |msb|
      msb.properties :configuration => :Release
      msb.targets :Build
      msb.solution = $projectSolution
    end

    task :nuget => [:build] do
      sh "nuget pack src\\Library\\Library.csproj /OutputDirectory " + $nugetFeedPath
    end

    Set up the local NuGet feed as a NuGet package source (this is only required once per machine; a one-line command for this is sketched at the end of this post). Then go to the consuming project and update the package: Update-Package Library or Install-Package Library.

    TLDR: change library code; run "rake nuget"; run "Update-Package Library" in the consuming application; build/test!

    If you manually execute any of this process, especially copying files, you will find it a burden to develop the library and will find yourself dreading it, and even worse, making changes downstream instead of updating the shared library for everyone's sake.

    Publishing

    Once you have a set of changes that you want to release, consider versioning, and possibly increment the minor version if needed. Pick the package out of your local feed, and copy it to a public/shared feed! I have a script to do this where I can drop the package on a batch file. Replace apikey with your NuGet feed's API key; take out the confirm prompts if you don't want them:

    @ECHO off
    echo Upload %1?
    set /P anykey="Hit enter to continue "
    nuget push %1 apikey
    set /P anykey="Done "

    Note: it helps to prune all the unnecessary versions from your local feed during testing, once you are done and ready to publish.

    TLDR: consider the version number; run the command to copy to the public feed.
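
    As referenced above, registering the local feed as a package source is a one-time step per machine; a sketch, assuming the feed path lives in the NuGetDevFeed environment variable used by the rake script:

    nuget sources add -Name LocalDevFeed -Source %NuGetDevFeed%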

    Read the article

  • EPM Planning (Hyperion) V11.1.2 Implementation Hands-On Boot-camp

    - by Mike.Hallett(at)Oracle-BI&EPM
    5-Day Training for Partners: 29th October - 2nd November 2012, London (UK): REGISTER Here

    This FREE-for-Partners 5-day workshop is designed to provide implementation instruction on Oracle Hyperion EPM Planning. This boot-camp is intended for prospective implementers of the Planning and Budgeting functionality of Oracle EPM, or implementers who are currently familiar with the basics of EPM Planning and looking to strengthen their base of knowledge in the product.

    The class begins with an overview of Essbase, the foundation of Hyperion Planning. It provides a general overview of Planning and Planning terms, the architecture of all the Planning components, and how they are commonly used. The course goes over all the steps to create an application from scratch. This involves some preparation work outside of Planning and leads to developing the application in both the Planning Windows and Web clients. Participants will modify existing dimensions and build out the hierarchies using the Web client.

    Topics Covered
    The boot-camp shows developers how to build out dimensions using Classic Planning and by using EPMA. It covers the mechanics and strategies for automating the build process, such as interface tables. It reviews data loads using Load Rules to load the Planning database. The course focuses on tasks that end users must perform during the planning cycle. It walks students through creating and modifying forms, working with forms to enter data, adding annotations, and the rest of the form features, such as running business rules and managing task lists. It covers how to use the forms in the Smart View client and finishes up the end-user perspective by going through Workflow Management and the process of submitting a plan for review. The final section of the course covers Security and other administration topics such as automation and deployment.

    Prerequisites
    Ideal participants are Oracle partners (SIs and resellers) with a background in business information systems and a clientele of customers with ongoing or prospective EPM initiatives. Alternatively, partners with the background described above and an interest in evolving their practice to a similar profile are suitable participants. Further online OPN guided learning path information and webinars are available at: Oracle Hyperion Planning 11 Essentials. Please note that attendees are required to bring a laptop. View here laptop requirements and detailed agenda.

    REGISTER Here: acceptance is subject to availability and your place will be confirmed within two weeks (and for help see the Partner Registration Guide).

    Training Location:
    Oracle Corporation UK Ltd
    Columbus Room, Customer Visit Center
    1 South Place, London EC2M 2RB

    Training Dates: 29th October - 2nd November, 9:30 am - 5:00 pm BST

    For more information please contact [email protected].

    Read the article

  • Solita Oy Achieves Oracle PartnerNetwork Specialization

    - by michaela.seika(at)oracle.com
    Helsinki, February 2, 2011 - Solita Oy, a member of the Oracle® PartnerNetwork (OPN), is the first Finnish enterprise to achieve OPN Specialized status for customer-specific systems integration and software solutions.

    To achieve Specialized status, Oracle partners are required to meet a stringent set of requirements that are based on the needs and priorities of the customer and partner community. By achieving a Specialized distinction, Solita Oy has been recognized by Oracle for its expertise in customer-specific systems integration and software solutions, achieved through competency development and demonstrated by the company's business results and proven success in implementing customer projects.

    "Solita and Oracle have cooperated for a long time, and we have been an Oracle partner for many years. We believe that the renewed partner program and the new partnership level that we have achieved will open up new opportunities for a closer collaboration with Oracle. Our increased focus on systems integration solutions and the stepping up of our specialized knowledge of SOA will enable us to provide even better solutions for our customers," said Jari Niska, Chief Executive Officer, Solita Oy.

    "Solita has shown trust and belief in Oracle's technology and in the business opportunities arising with it. They have contributed to building our cooperation in a consistent and systematic way. Achieving Specialized status in our partner program is a natural further step in our close and committed cooperation. It strengthens our trust in our ability to increase both turnover and profitability together," said Juha Kaskirinne, Alliances and Channel Leader, Oracle Finland Oy.

    About Oracle PartnerNetwork
    Oracle PartnerNetwork (OPN) Specialized is the latest version of Oracle's partner program that provides partners with tools to better develop, sell and implement Oracle solutions. OPN Specialized offers resources to train and support specialized knowledge of Oracle products and solutions and has evolved to recognize Oracle's growing product portfolio, partner base and business opportunity. Key to the latest enhancements to OPN is the ability for partners to differentiate through Specializations. Specializations are achieved through competency development, business results, expertise and proven success. To find out more visit http://www.oracle.com/partners or connect with the Oracle Partner community at OPN on Twitter, OPN on Facebook, OPN on LinkedIn, and OPN on YouTube.

    About Solita Oy
    Solita Oy is a Finnish company dedicated to developing demanding information system solutions and IT professional services. Solita's customers include prominent Finnish companies and public organizations. Solita's turnover in 2010 was about 17 million euros. The company was founded in 1996 and has over 170 employees. Further information: www.solita.fi

    Further information:
    Jari Niska, CEO, Solita Oy, tel. +358 40 524 6400, [email protected]
    Juha Kaskirinne, A&C Leader Finland, Oracle Finland Oy, tel. +358 40 506 3592, [email protected]

    Read the article

  • Function Folding in #PowerQuery

    - by Darren Gosbell
    Originally posted on: http://geekswithblogs.net/darrengosbell/archive/2014/05/16/function-folding-in-powerquery.aspx

    Looking at a typical Power Query query, you will notice that it's made up of a number of small steps. As an example, take a look at the query I did in my previous post about joining a fact table to a slowly changing dimension. It was roughly built up of the following steps:

    1. Get all records from the fact table
    2. Get all records from the dimension table
    3. Do an outer join between these two tables on the business key (resulting in an increase in the row count, as there are multiple records in the dimension table for each business key)
    4. Filter out the excess rows introduced in step 3
    5. Remove extra columns that are not required in the final result set.

    If Power Query were to execute a query like this literally, following the same steps in the same order, it would not be overly efficient, particularly if your two source tables were quite large. However, Power Query has a feature called function folding where it can take a number of these small steps and push them down to the data source. The degree of function folding that can be performed depends on the data source. As you might expect, relational data sources like SQL Server, Oracle and Teradata support folding, but so do some of the other sources like OData, Exchange and Active Directory.

    To explore how this works, I took the data from my previous post and loaded it into a SQL database. Then I converted my Power Query expression to source its data from that database. Below is the resulting Power Query, which I edited by hand so that the whole thing can be shown in a single expression:

    let
        SqlSource = Sql.Database("localhost", "PowerQueryTest"),
        BU = SqlSource{[Schema="dbo",Item="BU"]}[Data],
        Fact = SqlSource{[Schema="dbo",Item="fact"]}[Data],
        Source = Table.NestedJoin(Fact,{"BU_Code"},BU,{"BU_Code"},"NewColumn"),
        LeftJoin = Table.ExpandTableColumn(Source, "NewColumn"
                                      , {"BU_Key", "StartDate", "EndDate"}
                                      , {"BU_Key", "StartDate", "EndDate"}),
        BetweenFilter = Table.SelectRows(LeftJoin, each (([Date] >= [StartDate]) and ([Date] <= [EndDate])) ),
        RemovedColumns = Table.RemoveColumns(BetweenFilter,{"StartDate", "EndDate"})
    in
        RemovedColumns

    If the above query were run step by step in a literal fashion, you would expect it to run two queries against the SQL database, doing "SELECT * ..." from both tables. However, a profiler trace shows just the following single SQL query:

    select [_].[BU_Code],
        [_].[Date],
        [_].[Amount],
        [_].[BU_Key]
    from
    (
        select [$Outer].[BU_Code],
            [$Outer].[Date],
            [$Outer].[Amount],
            [$Inner].[BU_Key],
            [$Inner].[StartDate],
            [$Inner].[EndDate]
        from [dbo].[fact] as [$Outer]
        left outer join
        (
            select [_].[BU_Key] as [BU_Key],
                [_].[BU_Code] as [BU_Code2],
                [_].[BU_Name] as [BU_Name],
                [_].[StartDate] as [StartDate],
                [_].[EndDate] as [EndDate]
            from [dbo].[BU] as [_]
        ) as [$Inner] on ([$Outer].[BU_Code] = [$Inner].[BU_Code2] or [$Outer].[BU_Code] is null and [$Inner].[BU_Code2] is null)
    ) as [_]
    where [_].[Date] >= [_].[StartDate] and [_].[Date] <= [_].[EndDate]

    The resulting query is a little strange - you can probably tell that it was generated programmatically. But if you look closely, you'll notice that every single part of the Power Query formula has been pushed down to SQL Server. Power Query itself ends up just constructing the query and passing the results back to Excel; it does not do any of the data transformation steps itself. So now you can feel a bit more comfortable showing Power Query to your less technical colleagues, knowing that the tool will do its best to fold all the small steps in Power Query down to the most efficient query that it can against the source systems.

    Read the article

  • Auto-run script when iPad plugged in

    - by oldmankit
    The way that Ubuntu handles documents on the iPad is awesome (without any configuration required). It beats Windows, even after you install iTunes. I want to have the documents in certain iPad apps automatically synced into my Dropbox directory whenever the iPad is connected by USB. The syncing is easy; getting the script to run is not.

    I have already read the information in various (very out-of-date) tutorials. The best I could find was here: http://askubuntu.com/a/25091/16157

    I used lsusb, with the following results:

    Bus 002 Device 012: ID 05ac:12a2 Apple, Inc.

    (Please note that when an iPad is connected, Ubuntu seems to mount it to two different mount points: one for "Documents" and one for the whole iPad filesystem. They are both mounted in ~/.gvfs)

    I have created the following file, /etc/udev/rules.d/96-ipad_sync.rules:

    ACTION=="add", ATTRS{idVendor}=="05ac", ATTRS{idProduct}=="12a2", RUN+="/home/kit/bin/jobdone2"

    I want it to run a test script (which sleeps for five seconds, then plays an mp3 file; the test script works, and I have typed the location correctly). So far, when I plug the iPad in, nothing happens. Yes, I waited five seconds.

    This is the output I get from typing udevadm monitor --env:

    KERNEL[29348.114010] add /devices/pci0000:00/0000:00:1d.0/usb2/2-1/2-1.4 (usb)
    KERNEL[29348.114844] add /devices/pci0000:00/0000:00:1d.0/usb2/2-1/2-1.4/2-1.4:1.0 (usb)
    KERNEL[29348.129118] remove /devices/pci0000:00/0000:00:1d.0/usb2/2-1/2-1.4/2-1.4:1.0 (usb)
    KERNEL[29348.130699] add /devices/pci0000:00/0000:00:1d.0/usb2/2-1/2-1.4/2-1.4:4.0 (usb)
    KERNEL[29348.130845] add /devices/pci0000:00/0000:00:1d.0/usb2/2-1/2-1.4/2-1.4:4.1 (usb)
    KERNEL[29348.130909] add /devices/pci0000:00/0000:00:1d.0/usb2/2-1/2-1.4/2-1.4:4.2 (usb)
    UDEV [29348.163861] add /devices/pci0000:00/0000:00:1d.0/usb2/2-1/2-1.4 (usb)
    UDEV [29348.170390] add /devices/pci0000:00/0000:00:1d.0/usb2/2-1/2-1.4/2-1.4:1.0 (usb)
    UDEV [29348.171521] add /devices/pci0000:00/0000:00:1d.0/usb2/2-1/2-1.4/2-1.4:4.1 (usb)
    UDEV [29348.172230] remove /devices/pci0000:00/0000:00:1d.0/usb2/2-1/2-1.4/2-1.4:1.0 (usb)
    UDEV [29348.172890] add /devices/pci0000:00/0000:00:1d.0/usb2/2-1/2-1.4/2-1.4:4.2 (usb)
    UDEV [29348.175645] add /devices/pci0000:00/0000:00:1d.0/usb2/2-1/2-1.4/2-1.4:4.0 (usb)
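
    One detail worth knowing here, though it is not stated in the question: udev runs RUN+= programs outside any user session and kills ones that block, so a script that sleeps and then plays an mp3 may never appear to work even when the rule fires. A minimal sketch of a detaching wrapper, assuming the path from the rule above and that the at daemon is installed (ipad-sync.sh is a hypothetical name for the real sync script):

    #!/bin/sh
    # /home/kit/bin/jobdone2 - hand the real work to atd so udev does not kill it
    echo "/home/kit/bin/ipad-sync.sh" | at now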

    Read the article

  • Error while removing the new kernel 2.6.37

    - by Tarek
    Hi! I tried to install the new kernel, but something went wrong and I'm trying to remove it now. The error message is:

    mhd@Tarek-Laptop:~$ sudo apt-get install -f
    Reading package lists... Done
    Building dependency tree
    Reading state information... Done
    The following packages will be REMOVED:
      linux-image-2.6.37-020637-generic
    0 upgraded, 0 newly installed, 1 to remove and 9 not upgraded.
    1 not fully installed or removed.
    After this operation, 111MB disk space will be freed.
    Do you want to continue [Y/n]? y
    (Reading database ... 188780 files and directories currently installed.)
    Removing linux-image-2.6.37-020637-generic ...
    Examining /etc/kernel/postrm.d .
    run-parts: executing /etc/kernel/postrm.d/initramfs-tools 2.6.37-020637-generic /boot/vmlinuz-2.6.37-020637-generic
    run-parts: executing /etc/kernel/postrm.d/zz-update-grub 2.6.37-020637-generic /boot/vmlinuz-2.6.37-020637-generic
    /etc/default/grub: 33: Syntax error: EOF in backquote substitution
    run-parts: /etc/kernel/postrm.d/zz-update-grub exited with return code 2
    Failed to process /etc/kernel/postrm.d at /var/lib/dpkg/info/linux-image-2.6.37-020637-generic.postrm line 328.
    dpkg: error processing linux-image-2.6.37-020637-generic (--remove): subprocess installed post-removal script returned error exit status 1
    Errors were encountered while processing: linux-image-2.6.37-020637-generic
    E: Sub-process /usr/bin/dpkg returned an error code (1)

    The previously unsolved error is covered in this bug report. This is my grub configuration file:

    # If you change this file, run 'update-grub' afterwards to update
    # /boot/grub/grub.cfg.
    GRUB_DEFAULT=0
    #GRUB_HIDDEN_TIMEOUT=0
    GRUB_HIDDEN_TIMEOUT_QUIET=true
    GRUB_TIMEOUT=10
    GRUB_DISTRIBUTOR=`lsb_release -i -s 2> /dev/null || echo Debian`
    RUB_CMDLINE_LINUX_DEFAULT="quiet splash nomodeset video=uvesafb:mode_option=1024x768-24,mtrr=3,scroll=ywrap" video=uvesafb:mode_option=>>1024x768-24<<,mtrr=3,scroll=ywrap"
    GRUB_CMDLINE_LINUX=" vga=792 splash"
    # Uncomment to enable BadRAM filtering, modify to suit your needs
    # This works with Linux (no patch required) and with any kernel that obtains
    # the memory map information from GRUB (GNU Mach, kernel of FreeBSD ...)
    #GRUB_BADRAM="0x01234567,0xfefefefe,0x89abcdef,0xefefefef"
    # Uncomment to disable graphical terminal (grub-pc only)
    #GRUB_TERMINAL=console
    # The resolution used on graphical terminal
    # note that you can use only modes which your graphic card supports via VBE
    # you can see them in real GRUB with the command `vbeinfo'
    GRUB_GFXMODE=1024x768-24
    # Uncomment if you don't want GRUB to pass "root=UUID=xxx" parameter to Linux
    #GRUB_DISABLE_LINUX_UUID=true
    # Uncomment to disable generation of recovery mode menu entries
    #GRUB_DISABLE_LINUX_RECOVERY="true"
    # Uncomment to get a beep at grub start
    #GRUB_INIT_TUNE="480 440 1"

    Thank you for answering.
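
    Looking at the quoted config, the likely cause of the "EOF in backquote substitution" error at line 33 is visible: the GRUB_CMDLINE_LINUX_DEFAULT line has lost its leading G, carries stray >> << markers, and has unbalanced double quotes, which makes the shell misparse the rest of the file when it is sourced. A plausible repaired pair of lines, assuming the uvesafb options were intended:

    GRUB_CMDLINE_LINUX_DEFAULT="quiet splash nomodeset video=uvesafb:mode_option=1024x768-24,mtrr=3,scroll=ywrap"
    GRUB_CMDLINE_LINUX="vga=792 splash"

    After saving the file, running sudo update-grub followed by sudo apt-get install -f should let the kernel removal finish.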

    Read the article

  • Friday Tips #3

    - by Chris Kawalek
    Even though yesterday was Thanksgiving here in the US, we still have a Friday tip for those of you around your computers today. In fact, we have two! The first one came in last week via our #AskOracleVirtualization Twitter hashtag. The tweet has disappeared into the ether now, but we remember the gist, so here it is:

    Question: Will there be an Oracle Virtual Desktop Client for Android?
    Answer by our desktop virtualization product development team: We are looking at Android as a supported platform for future releases.

    Question: How can I make a Sun Ray Client automatically connect to a virtual machine?
    Answer by Rick Butland, Principal Sales Consultant, Oracle Desktop Virtualization: Someone recently asked how they can assign VMs to specific Sun Ray Desktop Units ("DTUs") without any user interaction being required, without the "Desktop Selector" being displayed, or any User Directory. That is, they wanted each Sun Ray to power on and immediately connect to a pre-assigned Solaris VM.

    This can be achieved by using "tokens" for user assignment - that is, the tokens found on Smart Cards, DTUs, or OVDC clients can be used in place of user credentials. Note, however, that mixing "token-only" assignments and "User Directories" in the same VDI Center won't work. Much of this procedure is covered in the documentation, particularly here. But it can be useful to have everything in one place, "cookbook-style":

    1. Create the "token-only" directory type: from the VDI administration interface, select "Settings", "Company", "New", select the "None" radio button, and click "Next." Enter a name for the new "Company", and click "Next", then "Finish."

    2. Create Desktop Providers, Pools, and VMs as appropriate.

    3. Access the Sun Ray administration interface at http://servername:1660, log in using "root" credentials, and find the token IDs you wish to use for assignment. If you're using DTU tokens rather than Smart Card tokens, these can be found under the "Tokens" tab, by "Search"-ing using the "Currently Used Tokens" tab. DTUs can be identified by the prefix "pseudo."

    4. Copy/paste this token into the VDI administrative interface, by selecting "Users", "New", pasting in the token ID, and clicking "OK".

    5. Assign the token (DTU) to a desktop: in the VDI admin GUI, select "Pool", "Desktop", select the VM, click "Assign" and select the token you want.

    In addition to assigning tokens to desktops, you'll need to bypass the login screen. To do this, you need to do the following:

    1. Disable VDI client authentication with:

    /opt/SUNWvda/sbin/vda settings-setprops -p clientauthentication=Disabled

    2. Disable the VDI login screen: add a kiosk argument of "-n" to the Sun Ray kiosk arguments screen. You set this on the Sun Ray administration page: "Advanced", "Kiosk Mode", "Edit", then add the "-n" option to the arguments screen.

    3. Restart both the Sun Ray and VDI services:

    # /opt/SUNWut/sbin/utstart -c
    # /opt/SUNWvda/sbin/vda-service restart

    Remember, if you have a question for us, please post on Twitter with our hashtag (again, it's #AskOracleVirtualization), and we'll try to answer it if we can. See you next time!

    Read the article

  • Database unit testing is now available for SSDT

    - by jamiet
    Good news was announced yesterday for those that are using SSDT and want to write unit tests: unit testing functionality is now available. The announcement was made on the SSDT team blog in the post Available Today: SSDT - December 2012. Here are a few thoughts about this news.

    Firstly, there seems to be a general impression that database unit testing was not previously available for SSDT - that's not entirely true. Database unit testing was most recently delivered in Visual Studio 2010, and any database unit tests written therein work perfectly well against SQL Server databases created using SSDT (why wouldn't they - it's just a database after all). In other words, if you're running SSDT inside Visual Studio 2010 then you could carry on freely writing database unit tests; some of the tight integration between the two (e.g. right-click on an object in SQL Server Object Explorer and choose to create a unit test) was not there - but I've never found that to be a problem. I am currently working on a project that uses SSDT for database development and have been happily running VS2010 database unit tests for a few months now.

    All that being said, delivery of database unit testing for SSDT is now with us, and that is good news, not least because we now have the ability to create unit tests in VS2012. We also get tight integration with SSDT itself, the like of which I mentioned above.

    Having now had a look at the new features, I was delighted to find that one of my big complaints about database unit testing has been solved. As I reported here on Connect, a refactor operation would cause unit test code to get completely mangled. See here the mangled result of such an operation:

    SELECT *
    FROM bi.ProcessMessageLog pml
    INNER JOIN bi.[LogMessageType] lmt
        ON pml.[LogMessageTypeId] = lmt.[LogMessageTypeId]
    WHERE pml.[LogMessage] = 'Ski[LogMessageTypeName]of message: IApplicationCanceled'
    AND lmt.[LogMessageType] = 'Warning';

    which is obviously not ideal. Thankfully that seems to have been solved with this latest release.

    One disappointment about this new release is that the process for running tests as part of a CI build has not changed from the horrendously complicated process required previously. Check out my blog post Setting up database unit testing as part of a Continuous Integration build process [VS2010 DB Tools - Datadude] for instructions on how to do it. In that blog post I describe it as "fiddly" - I was being kind when I said that!

    @Jamiet

    Read the article

  • Cloud hosted CI for .NET projects

    - by Scott Dorman
    Originally posted on: http://geekswithblogs.net/sdorman/archive/2014/06/02/cloud-hosted-ci-for-.net-projects.aspx

    Continuous integration (CI) is important. If you don't have it set up... you should. There are a lot of different options available for hosting your own CI server, but they all require you to maintain your own infrastructure. If you're a business, that generally isn't a problem. However, if you have some open source projects hosted, for example on GitHub, there haven't really been any options. That has changed with the latest release of AppVeyor, which bills itself as "Continuous integration for busy developers."

    What's different about AppVeyor is that it's a hosted solution. Why is that important? By being a hosted solution, it means that I don't have to maintain my own infrastructure for a build server. How does that help if you're hosting an open source project? AppVeyor has a really competitive pricing plan: for an unlimited number of public repositories, it's free. That gives you a cloud-hosted CI system for all of your GitHub projects for the cost of some time to set them up, which actually isn't hard to do at all.

    I have several open source projects (hosted at https://github.com/scottdorman), so I signed up using my GitHub credentials. AppVeyor fully supported my two-factor authentication with GitHub, so I never once had to enter my password for GitHub into AppVeyor. Once it was done, I authorized GitHub and it instantly found all of the repositories I have (both the ones I created and the ones I cloned from elsewhere). You can even add "build badges" to your markdown files in GitHub, so anyone who visits your project can see the status of the latest build.

    Out of the box, you can simply select a repository, add the build project, click New Build and wait for the build to complete. You now have a complete CI server running for your project. The best part of this, besides the fact that it "just worked" with almost zero configuration, is that you can configure it through a web-based interface which is very streamlined, clean and easy to use, or you can use an appveyor.yml file. This means that you can define your CI build process (including any scripts that might need to be run, etc.) in a standard file format (the YAML format) and store it in your repository. The benefits of that are huge. The file becomes a versioned artifact in your source control system, so it can be branched, merged, and is completely transparent to anyone working on the project.

    By the way, AppVeyor isn't limited to just GitHub. It currently supports GitHub, BitBucket, Visual Studio Online, and Kiln.

    I did have a few issues getting one of my projects to build, but the same day I posted the problem to the support forum a fix was deployed, and I had a functioning CI build about 5 minutes after that. Since then, I've provided some additional feature requests and had a few other questions, all of which have seen responses within a 24-hour period. I have to say that it's easily been one of the best customer support experiences I've seen in a long time.

    AppVeyor is still young, so it doesn't yet have full feature parity with some of the older (more established) CI systems available, but it's getting better all the time and I have no doubt that it will quickly catch up to those other CI systems and then pass them. The bottom line: if you're looking for a good cloud-hosted CI system for your .NET-based projects, look at AppVeyor.
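
    To make the appveyor.yml idea concrete, here is a minimal sketch of what such a file can look like - illustrative only, not taken from any of my repositories, and the solution and assembly names are made up:

    version: 1.0.{build}
    configuration: Release
    before_build:
      - nuget restore MyProject.sln    # hypothetical solution name
    build:
      project: MyProject.sln
    test:
      assemblies:
        - '**\*.Tests.dll'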

    Read the article

  • Tech Cast Live - Java and Oracle, One Year Later - February 15th 10AM PST

    - by Cassandra Clark
    Join us for a special live conversation with Ajay Patel, Vice President of Product Development for Application Grid Products, and Justin Kestelyn, Director of the Oracle Technology Network. Justin and Ajay will discuss the changes that have come to Java and Oracle since the Sun acquisition, just over a year ago. This live broadcast conversation will include discussion on:

    - Highlights, challenges and what we learned over the past year
    - The future of Java and its importance to Oracle and the community
    - Oracle's Application Grid product portfolio today

    Watch Live Event February 15th
    Watch Archived TechCast Lives

    You will also have the chance to submit questions to the speakers live on the show, for real-time feedback, by using #techcastlive. If your question is read on air, we will send you a free "I am the Future of Java" t-shirt*.

    *Promotion Details: After you have submitted your question and it is read on the live TechCast held February 15th, your shirt should arrive in two to four weeks, while supplies last. No purchase, payments, or fees are required to receive the gift. Limit one thank-you gift per person, and the offer is available only while supplies last. Oracle reserves the right to modify or terminate this offer at any time, for any reason. This offer is not available to Oracle employees or residents of countries subject to U.S. embargo (including Cuba, Iran, Iraq, Libya, North Korea, Sudan, and Syria). Due to Federal Government regulations, this offer is not available to Federal Government customers. Those residing in India or Brazil will be given a substitute gift, as we cannot ship t-shirts to your country. You are responsible for complying with your employer's policies regarding acceptance of promotional items, and for government laws, regulations and agency policies; if you are a government employee you will not be able to participate. Must be 18 years of age or older. Void where prohibited. Neither Oracle nor any third party assisting Oracle with this offer is responsible for any problems, errors, delays, or technical malfunction related to or impacting this offer. Oracle respects your right to privacy and your information will not be distributed or used for any other purpose. For more information, please review Oracle's privacy policy at http://www.oracle.com/html/privacy-policy.html. If you have any questions, please contact us at [email protected].

    Read the article

  • SharePoint 2010 Hosting :: How to Enable Office Web Apps on SharePoint 2010

    - by mbridge
    Office Web Apps is the online version of Microsoft Office 2010, which is very helpful if you are going to use SharePoint 2010 in your organization, as it allows you to do basic editing of Word documents without installing the Office suite on the client machine.

    Prerequisites:
    - Microsoft Server 2008 R2
    - Microsoft SharePoint Server 2010 or Microsoft SharePoint Foundation 2010
    - Microsoft Office Web Apps

    If you have installed all the above products, just follow these steps:

    1. Go to Central Administration > click on Manage Service Applications.
    2. All the menus are now displayed in the Ribbon menu format, which was first introduced in Office 2007. Click on New > Word Viewing Service (you can also choose PowerPoint or Excel; the steps are the same). This will open a popup window.
    3. Give it a proper name, which can include your company or project name.
    4. Under Application Pool, select: SharePoint Web Services Default.
    5. Next, keep the checkbox checked which says: "Add this service application's proxy to the farm's default proxy list." Click OK.
    6. This will install all the Office Web Apps services required. You can see the name as you gave it in the step above.

    How to activate Office Web Apps in a site collection?

    1. Go to the site for which you want to activate this feature.
    2. Click on Site Actions > Site Settings > Site Collection Administration > Site Collection Features.
    3. Activate Office Web Apps.

    How to make sure Office Web Apps is working for your site collection?

    1. Locate any Office document you have and click on the smart menu which appears when you hover your mouse over it. Don't double-click, as this will launch the document in the Office client if it's installed (this behavior can be changed).
    2. If you see "View in Browser" or "Edit in Browser" as a menu item, your Office Web Apps is configured correctly.

    Other posts related to SharePoint 2010:
    1. How to Configure SharePoint Foundation 2010 for SharePoint Workspace 2010
    2. Integrating SharePoint 2010 and SQL 2008 R2

    Read the article

  • Apache doesn't load .php files

    - by Haddex
    First, sorry for my English and for asking something that's answered all over the web. I've read a lot of posts about this problem, but I still can't find the solution. I'm a web developer who recently moved to Ubuntu from Windows 7. I had a website done (it's online and working) and I set up LAMP to keep working with it. I made a test.php file containing:

    <?php phpinfo(); ?>

    and put it in the /var/www/html directory; it shows all the information about PHP and I was really happy: "OK, it's all done, tomorrow I will work hard."

    But I placed my whole site into /var/www/html, not in a folder - the index.php is in /var/www/html - and guess what: it doesn't load any of my .php files; the browser just keeps thinking.

    What I did:

    - I restarted Apache: /etc/init.d/apache2 restart
    - I tried again with the test.php file and it works fine.
    - I put a .html file in /var/www/html and it works fine.
    - I looked at /etc/apache2/sites-enabled/000-default.conf and it says: DocumentRoot /var/www/html
    - I looked at /etc/apache2/mods-enabled/dir.conf and it says: DirectoryIndex index.html index.cgi index.pl index.php ...

    Edit: I think it's something related to phpMyAdmin, as if I'm not able to connect with the database. But I get nothing on the screen when trying to load the page, so I'm not sure. I can access the URL localhost/phpmyadmin, and I edited the connection.php file like this:

    <?php
    # FileName="Connection_php_mysql.htm"
    # Type="MYSQL"
    # HTTP="true"
    $hostname_rakstadconnection = "localhost";
    $database_rakstadconnection = "rakstadclandb";
    $username_rakstadconnection = "root";
    $password_rakstadconnection = "admin";
    $rakstadconnection = mysql_connect($hostname_rakstadconnection, $username_rakstadconnection, $password_rakstadconnection) or trigger_error(mysql_error(),E_USER_ERROR);
    mysql_query("SET NAMES 'utf8'");
    ?>

    The name of the database is correct, as are the user and password.

    http://i89.photobucket.com/albums/k220/Haddex/Capturadepantallade2014-06-09112609_zpsc45ddb72.png
    http://i89.photobucket.com/albums/k220/Haddex/Capturadepantallade2014-06-09112120_zps0b9e15f7.png

    Edit 2: Could this be because it's a website that I brought to Linux from Windows? I used Dreamweaver.

    Edit 3: I changed the "#" comments to "/* */"; nothing. The error.log file says:

    [Mon Jun 09 17:08:13.627881 2014] [:error] [pid 1517] [client 127.0.0.1:46663] PHP Warning: require_once(/var/www/html/Connections/rakstadconnection.php): failed to open stream: Permission denied in /var/www/html/index.php on line 1
    [Mon Jun 09 17:08:13.627933 2014] [:error] [pid 1517] [client 127.0.0.1:46663] PHP Fatal error: require_once(): Failed opening required 'Connections/rakstadconnection.php' (include_path='.:/usr/share/php:/usr/share/pear') in /var/www/html/index.php on line 1

    I'm reading the error log, but... should I add a Linux path into my index.php file? I don't think so. Thanks.
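
    The "Permission denied" lines in that error log point at file permissions rather than PHP itself - plausible for a site copied over from Windows. Not part of the original post, but a typical check-and-fix on Ubuntu (adjust ownership and modes to taste for a development box) would be:

    # See who owns the include that Apache cannot read
    ls -l /var/www/html/Connections/
    # Give ownership to Apache's user and make the tree readable
    sudo chown -R www-data:www-data /var/www/html
    sudo find /var/www/html -type d -exec chmod 755 {} \;
    sudo find /var/www/html -type f -exec chmod 644 {} \;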

    Read the article

  • Partner Webcast – Oracle SOA Suite 12c: Connect 4 Cloud, Mobile, IoT with On-premise - August 28th 2014

    - by JuergenKress
    Thursday August 28th 2014: SOA Suite 12c Webcast

    The pace of new business projects continues to grow, from increasing customer self-service to seamlessly connecting all your back office and in-the-field applications. At the same time, increased integration complexity may seem inevitable, as organizations are suddenly faced with the requirement to support three new integration challenges:

    » Cloud Integration - integrate with the cloud; rapidly integrate a growing list of cloud applications with existing applications
    » Mobile Integration - the urgency to mobile-enable existing applications
    » IoT Integration - begin development on the latest trend of connecting Internet of Things (IoT) devices to your existing infrastructure

    Join this webcast to get an overview of Oracle SOA Suite 12c and how it positions you to meet these challenges.

    Oracle SOA Suite 12c, the latest version of the industry's most complete and unified application integration and SOA solution, aims to simplify, accelerate and optimize integrations. Oracle SOA Suite 12c and its associated products - Oracle Managed File Transfer, Oracle Cloud and Application Adapters, B2B and healthcare integration - offer the industry's most highly integrated platform for solving the increased integration challenges. Oracle SOA Suite 12c is a complete, integrated and best-of-breed platform. It enables next-generation integration capabilities through:

    - A unified toolset for the development of services and composite applications.
    - A standards-based platform that is service enabled and easily consumable by modern web applications, allowing enterprises to quickly and easily adapt to changes in their business and IT environments.
    - Greater visibility, controls and analytics to govern how services and processes are deployed, reused and changed across their entire lifecycle.

    Join us to find out more about the new features of Oracle SOA Suite 12c and how it enables you to reduce time to market for new project integration and to reduce integration cost and complexity. A key strength of Oracle SOA Suite is the ability to simplify by integrating the disparate requirements of cloud, mobile, and IoT devices with existing on-premise applications.

    Agenda:
    - Oracle SOA Suite 12c new features
    - Cloud Integration
    - Mobile Enablement
    - Internet of Things (IoT)
    - Summary - Q&A

    For details please visit our registration page here.

    Thursday, Aug 28th 2014, 10am CET (9am GMT / 11am EEST)

    SOA & BPM Partner Community: For regular information on Oracle SOA Suite, become a member of the SOA & BPM Partner Community; for registration please visit www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account, please contact the Oracle Partner Business Center.

    Technorati Tags: SOA Suite 12c, Community, Oracle SOA, Oracle BPM, OPN, Jürgen Kress, SOA

    Read the article

  • Why is my dual-boot Ubuntu partition showing up as a peripheral "root.disk"?

    - by Don
    I recently installed Ubuntu 12.04, which I had been booting from a USB key, as a dual-boot on my machine running Windows 7. From what I had read while researching, I was prepared to have to shrink the Windows partition and all that. But I never had to; it really was just a few clicks here and there and it was installed. I'm still pretty confused about it, but whatever: it worked, the two peacefully coexist on my machine, and I have broken things to fix before I worry about fixing unbroken things. So yesterday I got it into my head to look at my partitions (I was considering making an all-new partition to install the Windows 8 Release Preview), and what I saw confused me. Here's a screenshot of the disk utility. At this moment, there is nothing connected to my computer, and nothing in any of the optical drives/ports/card readers/etc. Can you help me figure out what's going on here? "Don's Machine" is, I believe, my Windows partition; that's the name I assigned my machine from Windows Explorer. PQSERVICE is, from what I can find online, also Windows, but having to do with backup. And SYSTEM REQUIRED, if I browse it in Ubuntu, is definitely something to do with booting, and I believe it is also Windows'. According to the sizes shown, those three together should use up my 500 GB HD. Then further down, as a "peripheral device", it lists that 31 GB disk. This is obviously my Ubuntu (Model: Linux, Loop: root.disk), but why is it showing up as a peripheral? So, to sum up those questions and to add some more random ones I had:

    - Why is Ubuntu showing up as a peripheral device?
    - If the Windows sections take up all 500 GB, where does Ubuntu live?
    - If I renamed the disk partitions, would my life become a nightmare (seriously, can I safely rename them)?
    - Why didn't I have to resize the Windows partition in the first place?
    - Would giving Ubuntu more space improve its performance (it hangs a lot)?
    - Is it possible to have a partition for each OS (Windows 7 & 8, Ubuntu), a partition for files, and a separate partition for backups? Is this towards the good or bad idea end of the spectrum?

    @Elfy, would that explain why it keeps hanging? I guess I'll back up my files, rip it out, and reinstall it correctly later on today.

    Read the article

  • Will HTML5 make Silverlight redundant?

    - by Laila
    One of the great features of Adobe AIR v2, launched this month, is its support for some of the 2008 draft of HTML5. The HTML5 specification was started in 2004, but the full spec will probably not be approved by the W3C until around 2022. One might have thought it would take years yet to reach the point where any browsers were remotely HTML5-compliant, but enough of HTML5 is published and agreed to make a lot of it possible, and Safari and Adobe have got there thanks to Apple's open-source WebKit. The race for HTML5 has been fuelled by the demand from Apple and Google for advanced graphics, typography, animations and transitions without having to rely on third-party browser plug-ins such as Adobe Flash or Silverlight. There is good reason for this haste: Flash doesn't support touch devices and has been slow to support hardware video decoders such as H.264. There is a strong requirement to do all that Flash can do in an open-standards way. Those with proprietary solutions remain sniffy. In AIR 2, Adobe pointedly disables the HTML5 <video> and <audio> tags that allow basic playing of media content, saying that the specification is not final and there is still no standard for the supported formats, and adding that Safari implements a 'disjoint set' of codecs. Microsoft also has little interest in HTML5, as it has so much invested in Silverlight. Google stands to gain from Adobe AIR for Android, as it will allow a lot of applications to be migrated easily to the platform, so it sees Apple's war on Flash as a way of gaining market share. Why do we care? Because HTML5/CSS3 provides facilities far beyond HTML4, bringing the reality of browser-based applications a lot closer. Probably the most generally useful is the advanced typography: Safari and AIR both already support a way of reflowing text in a container across an arbitrary number of columns, and page-specific fonts can also be specified. Then there is 2D drawing, video, transitions, local storage, AJAX navigation and mutable DOM prototypes. HTML5 is likely to provide the base functionality that is required, but it is too early to be certain that it will render Flash, Silverlight or JavaFX obsolete. In the meantime, Adobe AIR provides the best vehicle for developing HTML5/CSS3 applications without a twinge of worry about browser incompatibilities. Cheers, Laila

    Read the article

  • How do I stop icons appearing on the desktop under conky?

    - by Seamus
    When I download something to my desktop, or insert a CD or flash drive, an icon appears on my desktop. When I have conky running, the icon sometimes appears in the top right corner, underneath conky, where I can't see it. How do I stop this happening? My .conkyrc is pasted below. I didn't write it all myself, so I'm not entirely sure what I need to change, or which parts are relevant to this particular question:

        # UBUNTU-CONKY
        # A comprehensive conky script, configured for use on
        # Ubuntu / Debian Gnome, without the need for any external scripts.
        #
        # Based on conky-jc and the default .conkyrc.
        # INCLUDES:
        # - tail of /var/log/messages
        # - netstat shows number of connections from your computer and application/PID making it. Kill spyware!
        #
        # -- Pengo

        # Create own window instead of using desktop (required in nautilus)
        own_window yes
        own_window_type override
        own_window_transparent yes
        own_window_hints undecorated,below,sticky,skip_taskbar,skip_pager

        # Use double buffering (reduces flicker, may not work for everyone)
        double_buffer yes

        # fiddle with window
        use_spacer right

        # Use Xft?
        use_xft yes
        xftfont DejaVu Sans:size=8
        xftalpha 0.8
        text_buffer_size 2048

        # Update interval in seconds
        update_interval 3.0

        # Minimum size of text area
        # minimum_size 250 5

        # Draw shades?
        draw_shades no

        # Text stuff
        draw_outline no   # amplifies text if yes
        draw_borders no
        uppercase no      # set to yes if you want all text to be in uppercase

        # Stippled borders?
        stippled_borders 3

        # border margins
        border_margin 9

        # border width
        border_width 10

        # Default colors and also border colors, grey90 == #e5e5e5
        default_color grey
        own_window_colour brown
        own_window_transparent yes

        # Text alignment, other possible values are commented
        #alignment top_left
        alignment top_right
        #alignment bottom_left
        #alignment bottom_right

        # Gap between borders of screen and text
        gap_x 10
        gap_y 20

        # stuff after 'TEXT' will be formatted on screen
        TEXT
        $color
        ${color orange}SYSTEM ${hr 2}$color
        $nodename $sysname $kernel on $machine

        ${color orange}CPU ${hr 2}$color
        ${freq}MHz Load: ${loadavg} Temp: ${acpitemp}
        $cpubar
        ${cpugraph 000000 ffffff}
        NAME ${goto 150}PID ${goto 200}CPU% ${goto 250}MEM%
        ${top name 1} ${goto 150}${top pid 1} ${goto 200}${top cpu 1} ${goto 250}${top mem 1}
        ${top name 2} ${goto 150}${top pid 2} ${goto 200}${top cpu 2} ${goto 250}${top mem 2}
        ${top name 3} ${goto 150}${top pid 3} ${goto 200}${top cpu 3} ${goto 250}${top mem 3}
        ${top name 4} ${goto 150}${top pid 4} ${goto 200}${top cpu 4} ${goto 250}${top mem 4}

        ${color orange}MEMORY / DISK ${hr 2}$color
        RAM: $memperc% ${membar 6}$color
        Swap: $swapperc% ${swapbar 6}$color
        Home: ${fs_free_perc /home}% ${fs_bar 6 /}$color
        Free Space: ${fs_free /home}

        ${color orange}NETWORK (${addr eth0}) ${hr 2}$color
        Down: $color${downspeed eth0} k/s ${alignr}Up: ${upspeed eth0} k/s
        ${downspeedgraph eth0 25,140 000000 ff0000} ${alignr}${upspeedgraph eth0 25,140 000000 00ff00}$color
        Total: ${totaldown eth0} ${alignr}Total: ${totalup eth0}
        ${execi 30 netstat -ept | grep ESTAB | awk '{print $9}' | cut -d: -f1 | sort | uniq -c | sort -nr}

        ${color orange}WIRELESS (${addr wlan0}) ${hr 2}$color
        Down: $color${downspeed wlan0} k/s ${alignr}Up: ${upspeed wlan0} k/s
        ${downspeedgraph wlan0 25,140 000000 ff0000} ${alignr}${upspeedgraph wlan0 25,140 000000 00ff00}$color
        Total: ${totaldown wlan0} ${alignr}Total: ${totalup wlan0}
        ${execi 30 netstat -ept | grep ESTAB | awk '{print $9}' | cut -d: -f1 | sort | uniq -c | sort -nr}

    Read the article

  • Single Responsibility Principle Implementation

    - by Mike S
    In my spare time, I've been designing a CMS in order to learn more about actual software design and architecture. Going through the SOLID principles, I notice that ideas like "MVC", "DRY", and "KISS" pretty much fall right into place. That said, I'm still having trouble deciding which of two implementations is the better choice when it comes to the Single Responsibility Principle.

    Implementation #1:

        class User
            getName
            getPassword
            getEmail
            // etc...

        class UserManager
            create
            read
            update
            delete

        class Session
            start
            stop

        class Login
            main

        class Logout
            main

        class Register
            main

    The idea behind this implementation is that all user-based actions are separated out into different classes (creating a possible case of the aptly named Ravioli Code), following the SRP to a "tee", almost literally. But then I thought that was a bit much, and came up with this next implementation:

        class UserView extends View
            getLogin       // Returns the html for the login screen
            getShortLogin  // Returns the html for an inline login bar
            getLogout      // Returns the html for a logout button
            getRegister    // Returns the html for a register page
            // etc... as needed

        class UserModel extends DataModel implements IDataModel
            // Implements no new methods yet, outside of the interface methods.
            // Haven't figured out anything special to go here at the moment.
            // All CRUD operations are handled by DataModel
            // through methods implemented by the interface.

        class UserControl extends Control implements IControl
            login
            logout
            register
            startSession
            stopSession

        class User extends DataObject
            getName
            getPassword
            getEmail
            // etc...

    This is obviously still very organized, and still very "single responsibility". The User class is a data object that I can manipulate and then pass to the UserModel to save to the database. All the user-data rendering (what the user will see) is handled by UserView and its methods, and all the user actions live in one place in UserControl (plus some automated stuff required by the CMS to keep a user logged in, or to ensure that they stay out). I personally can't think of anything wrong with this implementation either. I feel that both are effectively correct, but I can't decide which one would be easier to maintain and extend as life goes on (despite leaning towards Implementation #1). So what about you guys? What are your opinions on this? Which one is better? What basics (or otherwise, nuances) of that principle have I missed in either design?
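    To make the trade-off concrete, here is a minimal PHP sketch of the second implementation's data/persistence/control split. The class and method names follow the pseudocode above, but the bodies and the constructor wiring are hypothetical illustration, not the asker's actual CMS code:

        <?php
        // Hypothetical sketch: each class keeps a single reason to change.
        // User holds data, UserModel persists it, UserControl drives the use case.

        class User
        {
            private $name;
            private $email;

            public function __construct($name, $email)
            {
                $this->name  = $name;
                $this->email = $email;
            }

            public function getName()  { return $this->name; }
            public function getEmail() { return $this->email; }
        }

        class UserModel
        {
            public function save(User $user)
            {
                // CRUD against the database would live here, and only here.
            }
        }

        class UserControl
        {
            private $model;

            public function __construct(UserModel $model)
            {
                $this->model = $model;
            }

            public function register($name, $email)
            {
                $user = new User($name, $email);
                $this->model->save($user);
                return $user;
            }
        }

        // Usage: wire the pieces together at the edge of the application.
        $control = new UserControl(new UserModel());
        $control->register('mike', 'mike@example.com');
        ?>

    One practical payoff of this split, whichever layout you choose, is testability: UserControl can be exercised against a stub subclass of UserModel without touching a real database.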

    Read the article

  • ArchBeat Link-o-Rama for July 2, 2013

    - by Bob Rhubart
    - One Week To Go: OTN Architect Day: Cloud Computing - July 9, 2013, Redwood Shores, CA. The first OTN Architect Day event of 2013 happens in just one week, on Tuesday July 9 at the Oracle Conference Center in Redwood Shores, CA. Registration is free, and you get three sessions by three experts on cloud computing in the real world, plus a panel Q&A for answers to all of your questions. Register now!
    - Oracle Database 12c: Flashback Moving Forward | Lucas Jellema. Oracle ACE Director Lucas Jellema's latest of several recent blog posts dealing with various aspects of the recently released Oracle Database 12c.
    - Detroit, Embracing New Auto Technologies, Seeks App Builders. This story from the New York Times paints a rosy picture indeed for app developers as the internet of things continues to evolve.
    - Advanced View Criteria Implementation in ADF BC | Andrejus Baranovskis. Oracle ACE Director Andrejus Baranovskis' post focuses on advanced declarative View Criteria features.
    - JDeveloper: Showing a Popup when Selecting an af:selectOneRadio | Timo Hahn. Oracle ACE Timo Hahn illustrates a use case in which a popup is displayed each time the user clicks on one of the radio buttons of a button group.
    - Can Technology Innovation Save The New York Times? One of the standout keynotes from the recent QCon New York event, this presentation by New York Times Sr. VP/CIO Marc Frons and CTO/VP Rajiv Pant paints a detailed portrait of the complete transformation of an organization, not just its IT. Enterprise architects will find this particularly interesting.
    - Video: Meet Growing IT Demand for Databases with Private DBaaS. Do you understand the difference between traditional database deployment and database as a service? If not, you'll want to check out this video, which includes an overview of Oracle Enterprise Manager's capabilities for rapid deployment of DBaaS.
    - Webcast: Zero-Downtime Migration to Oracle Exadata Using Oracle GoldenGate: A Customer Case Study. Presenters Alok Pareek (VP, Product Management/Development, Oracle Data Integration) and John F. Martin (CEO of Emerging Markets and CTO, IQNavigator) discuss how IQNavigator is using Oracle GoldenGate with Oracle Exadata.
    - Free eBook: Building a Database Cloud for Dummies. This free quick-reference guide, organized into six short chapters and supplemented with helpful illustrations, provides a clear overview of the cloud and step-by-step instructions on deploying database as a service. (Registration required.)

    Thought for the Day: "My motto is: Live every day to the fullest – in moderation." - Lindsay Lohan (Born July 2, 1986) Source: brainyquote.com

    Read the article
