Search Results

Search found 26124 results on 1045 pages for 'unreal development kit'.


  • Why does Eclipse hang while in debug mode?

    - by Pratik
    We are developing our web application with the Java GWT framework, using Eclipse Indigo as the IDE. We are having trouble debugging the GWT application in Eclipse: most of the time, Eclipse hangs while debugging. We tried increasing the memory settings in Eclipse, but with no luck. We have also tried running Eclipse in various environments (Windows, Fedora 16, CentOS), but somehow have not had positive results. Can anyone help me decide which OS and which Eclipse version we should use to resolve the hanging issue? Thanks in advance. Pratik
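
    As an aside, "memory settings" here usually means the JVM arguments in eclipse.ini; a hedged example only (the values below are illustrative, not taken from the question) of giving an Indigo-era Eclipse more heap and PermGen space:

        -vmargs
        -Xms512m
        -Xmx1024m
        -XX:MaxPermSize=256m

    These arguments belong after the -vmargs marker in eclipse.ini, and whether more memory actually cures the hang depends on what the debugger is blocked on.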

    Read the article

  • Oracle VADs (Value Added Distributors) in Portugal

    - by Paulo Folgado
    With the recent incorporation of Sun into Oracle, and the consequent welcoming into its reseller channel of the hardware distributors (until then designated by Sun as CDPs - Channel Development Providers), Oracle took the opportunity to restructure its distribution channel at a global level. This restructuring aimed to achieve several objectives: to standardize the commercial conditions and processes between the newly incorporated Sun CDPs and the existing Oracle VADs; to reduce the total number of VADs globally; to give preference to VADs with international operations rather than purely local operations in a single country; and to grant each of the selected VADs the distribution of all Oracle product lines, including Software and Hardware. As a result of this restructuring, we are pleased to announce that Oracle Portugal will now operate with the two following VADs: Each of these VADs will distribute, as mentioned above, both the Software and Hardware product lines. For more details on the two companies and their respective contacts, please see: http://blogs.oracle.com/opnportugal/vad/vad.html. We are certain that this restructuring will contribute to an even greater energizing of the Oracle Portugal partner ecosystem.

    Read the article

  • Cross-Browser Extension Installation now Possible with Opera and Google Chrome

    - by Akemi Iwaya
    People have been curious if there would be cross-browser compatibility for extensions due to Opera’s recent switch to the browser engine that Google Chrome uses. That question has now been answered. The OMG! Chrome! Blog has put together a nice tutorial on how to get cross-browser extension compatibility set up and working with your browser of choice. Screenshot courtesy of OMG! Chrome! Blog. While it is not surprising that the first steps in cross-browser extension compatibility have been taken, it will be interesting to see how it develops as the process is refined and further development occurs with the ‘new’ Opera. What are your thoughts on this? Is cross-browser extension compatibility really that important? Perhaps you feel that it does not matter? Let us know your thoughts in the comments!    

    Read the article

  • Check your Embed Interop Types flag when doing Visual Studio extensibility work

    - by Daniel Cazzulino
    In case you didn’t notice, VS2010 adds a new property to assembly references in the properties window: Embed Interop Types. This property was introduced as a way to overcome the pain of deploying Primary Interop Assemblies. Read that blog post; it will help you understand why you DON’T need it when doing VS extensibility (VSX) work. It's generally advisable when doing VSX development NOT to use Embed Interop Types, which is a feature intended mostly for Office PIA scenarios where the PIA assemblies are HUGE and have to be shipped with your app. This is NEVER the case with VSX authoring. All interop assemblies you reference (EnvDTE, VS.Shell, etc.) are ALWAYS already there on the users' machines, and you NEVER need to distribute them. So embedding those types only increases your assembly size without a single benefit to you (the extension developer/author).... Read full article
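
    For reference, the same flag lives as item metadata on the reference in the project file; a hedged sketch only (the EnvDTE reference is just an illustrative example of a VS interop assembly, not taken from the article):

        <ItemGroup>
          <Reference Include="EnvDTE">
            <!-- Illustrative: keep interop types out of the extension assembly for VSX work -->
            <EmbedInteropTypes>False</EmbedInteropTypes>
          </Reference>
        </ItemGroup>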

    Read the article

  • Backend devs put down by user stories

    - by Szili
    I planned to slice backend development into the user stories vertically, but a backend guy on our team started to complain that this makes their work invisible. My answer was that at the sprint planning and review meetings we discuss backend tasks in front of stakeholders, so it is visible, and that maintaining high quality during the project will result in a slower starting pace than other teams, but we will have a stable velocity during the project. And velocity is highly visible to stakeholders. He still insists on having stories like: "As a developer I need to have a domain layer so I can encapsulate business logic." How can I solve the issue before it pollutes the team? The root of the issue is that our management systematically considers backend work invisible and calls backend devs miners, or other pejorative terms.

    Read the article

  • Oracle Response to Apache Departure from JCP

    - by Henrik Ståhl
    Last month Oracle renominated Apache to the Java Executive Committee because we valued their active participation and perspective on Java. Earlier this week, by an overwhelming majority, the Java Executive Committee voted to move Java forward by formally initiating work on both Java SE 7 and SE 8 based on their technical merits. Apache voted against initiating technical committee work on both SE 7 and SE 8, effectively voting against moving Java forward. Now, despite supporting the technical direction, Apache have announced that they are quitting the Executive Committee. Oracle has a responsibility to move Java forward and to maintain the uniformity of the Java standard for the millions of Java developers and the majority of Executive Committee members agree. We encourage Apache to reconsider its position and remain a part of the process to move Java forward. ASF and many open source projects within it are an important part of the overall Java ecosystem. Adam Messinger, Vice President of Development

    Read the article

  • What is the role of a Configuration Manager?

    - by altern
    I would like to ask members of the community about the role of Configuration Manager as you see it. I'm not asking what Configuration Management is, as that has been asked before. What I need to know is: What tasks do you think a Configuration Manager should perform (or performs) in your team? What is the primary responsibility of a Configuration Manager? What are the secondary/auxiliary responsibilities of a Configuration Manager? Does the Configuration Manager need to be in charge of development processes in the project/company, or should they be told what to do? What are the relations between the Configuration Manager, Build Manager, Release Manager, Deployment Engineer, and CI Engineer roles? Aren't they all the same - Configuration Management? Maybe the term Configuration Management is redundant and a Technical/Team Lead should do all the related work instead? It would be really great if you could share your vision and experience.

    Read the article

  • Conditional AddHandler Directive

    - by Itai
    Is it possible to conditionally call AddHandler in the .htaccess under Apache (2.x)? My present situation is that a certain AddHandler is needed on one production server, but that same directive breaks the development server. This forces me to keep two versions of .htaccess, which is a pain. So instead I would like to wrap that AddHandler within a conditional, something of this sort: IF IP=='1.2.3.4' THEN AddHandler type/foo .ext ENDIF. The problem is new but out of my control for now. I know this is far from ideal; the servers used to match 100% as they should, but temporarily they cannot.
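
    For what it's worth, a hedged sketch of how this can be expressed natively (the PRODUCTION define and the IP below are placeholders, not values from the question): on Apache 2.2 the directive can be wrapped in <IfDefine>, which only applies when that server's httpd is started with a matching -D flag, while Apache 2.4 and later can also test a server variable directly with <If>. Both containers are permitted in .htaccess.

        # Alternative for Apache 2.2: applied only when this server's httpd is started with "-D PRODUCTION"
        <IfDefine PRODUCTION>
            AddHandler type/foo .ext
        </IfDefine>

        # Alternative for Apache 2.4 or later: test the server's own address (placeholder IP)
        <If "%{SERVER_ADDR} == '1.2.3.4'">
            AddHandler type/foo .ext
        </If>

    Note that <If> is not understood by Apache 2.2, so only the variant matching the installed version should go into a shared .htaccess.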

    Read the article

  • Use virtual pageviews for all goal tracking

    - by Jeff Wu
    I'm new to Google Analytics and I'm wondering if it would be cleaner to use virtual pageviews for all the goal tracking on my website instead of using a mix of regular pageviews and virtual pageviews. I know in most cases this is just semantics, but there are multiple pages where the same goal can be achieved, and I think it would be cleaner just to fire the same virtual pageview instead of having two different goal pages. Will this model also give developers more flexibility when they do development? I know we are moving to a CMS and URLs can get hairy, so I think this might be a good way to make the analytics portion of the site "future proof". Any thoughts are appreciated! Thanks.

    Read the article

  • Telerik Introduces New Developer Tool Designed to Simplify Unit Test Mocking

    JustMock extends Telerik's commitment to providing Visual Studio developer productivity. Waltham, MA, April 13, 2010 - Telerik, the leading vendor of development tools and user interface components for .NET, today announced the introduction of JustMock, a productivity add-in for Microsoft Visual Studio 2008 and 2010. JustMock helps developers easily create object mocks in unit tests, saving time and improving the quality of software testing. JustMock is being introduced as a Beta and is scheduled...

    Read the article

  • Multithreaded UI desktop application issues

    - by igor
    I am involved in developing a rich UI project: a desktop Windows application. The application uses asynchronous invocations and, in turn, must be ready to process external messages (events). The problem is clear: at first it was built as a simple prototype, it was not stress tested, and all was fine. Then the application grew: the number of calls to the server and the number of events from the server are now high, and performance is low. What is more, users have noticed that performance is sometimes extremely low. Asynchronous invocations are based on the thread pool (BeginInvoke, EndInvoke); external events come from a WCF service (.NET 3.5). My goal is to synchronize all tasks and assign priorities to every execution in the desktop application. My question is: is there any established practice for reaching this goal - patterns, a task priority list, anything else? What should I do first, second, and after that? Thanks

    Read the article

  • Desktop login fails, terminal works

    - by Tobias
    I have a freshly set up 12.04 LTS PC system (120 GB SSD, 1 TB HDD, 16 GiB RAM); for a few days now, I have not been able to log in to the graphical desktop anymore: there is a very briefly flashing shell window which disappears very quickly, and I'm confronted with the login screen again. I believe it says something about modprobe and vbox, but I can't read it fast enough ... I can log in to a terminal (Ctrl+Alt+F1). It did not help to chown all contents of my home directory to me:my-group, as suggested here. This is what I could find in /var/log, grepping for the date and time (I inserted line breaks after <my-hostname>; real time values preserved):

    auth.log:

        <date> 22:43:01 <my-hostname> lightdm: pam_succeed_if(lightdm:auth): requirement "user ingroup nopasswdlogin" not met by user "tobias"
        <date> 22:43:08 <my-hostname> lightdm: pam_unix(lightdm:session): session closed for user lightdm
        <date> 22:43:08 <my-hostname> lightdm: pam_unix(lightdm:session): session opened for user tobias by (uid=0)
        <date> 22:43:08 <my-hostname> lightdm: pam_ck_connector(lightdm:session): nox11 mode, ignoring PAM_TTY :0
        <date> 22:43:08 <my-hostname> lightdm: pam_unix(lightdm:session): session closed for user tobias
        <date> 22:43:09 <my-hostname> lightdm: pam_unix(lightdm:session): session opened for user lightdm by (uid=0)
        <date> 22:43:09 <my-hostname> lightdm: pam_ck_connector(lightdm:session): nox11 mode, ignoring PAM_TTY :0
        <date> 22:43:10 <my-hostname> lightdm: pam_succeed_if(lightdm:auth): requirement "user ingroup nopasswdlogin" not met by user "tobias"
        <date> 22:43:10 <my-hostname> dbus[756]: [system] Rejected send message, 2 matched rules; type="method_call", sender="1:43" (uid=104 pid=1639 comm="/usr/lib/indicator-datetime/indicator-datetime-ser") interface="org.freedesktop.DBus.Properties" member="GetAll" error name="(unset)" requested_reply="0" destination=":1.15" (uid=0 pid=1005 comm="/usr/sbin/console-kit-daemon --no-daemon ")

    kern.log:

        <date> 22:43:00 <my-hostname> kernel: [ 16.084525] eth0: no IPv6 routers present

    syslog:

        <date> 22:43:00 <my-hostname> kernel: [ 16.084525] eth0: no IPv6 routers present
        <date> 22:43:01 <my-hostname> ntpdate[1492]: adjust time server 91.189.94.4 offset -0.162831 sec
        <date> 22:43:08 <my-hostname> acpid: client 969[0:0] has disconnected
        <date> 22:43:08 <my-hostname> acpid: client connected from 1553[0:0]
        <date> 22:43:08 <my-hostname> acpid: 1 client rule loaded

    I have VirtualBox and TrueCrypt installed, but I can't think of a reason why they might prevent a graphical login. I'm confused: What is this about the requirement "user ingroup nopasswdlogin" not being met? I do log in using a password, and the password works fine when logging in to a terminal! Can I somehow read the error output, e.g. by delaying it, redirecting it to a file, or having the system prompt me to press a key? Could a recent update possibly have caused my problem? Should I install the pending updates? And how, by the way, without access to the graphical UI? I have some working knowledge of the Linux shell, but I'm new to Ubuntu. Any help would be appreciated.

    Read the article

  • links for 2010-05-26

    - by Bob Rhubart
    @vambenempe - Dear Cloud API, your fault line is showing "I am talking about the dreadful state of fault reporting in remote APIs, from Twitter to Cloud interfaces. They are badly described in the interface documentation and the implementations often don’t even conform to what little is documented." -- William Vambenempe (tags: oracle otn cloud) @oraclebase: Consuming Web Services using PL/SQL Oracle ACE Director Tim Hall shares a couple of solutions for consuming web services using PL/SQL. (tags: oracle otn oracleace soa sql webservices) Douwe Pieter van den Bos: IT Project misstep: To Serve and Protect "Thoughts and vision change during time. We gain new insights and other people share their knowledge. This is exactly why software development projects need to be based on a change facilitating manner, not trying to avoid change, or make it more difficult." -- Douwe Pieter van den Bos (tags: oracle otn architect projectmanagement innovation)

    Read the article

  • Oracle Utilities Mobile Workforce Management V2.0 has arrived

    - by Anthony Shorten
    It is finally upon us. Oracle Utilities Mobile Workforce Management (MWM) V2.0 has been released and is now available (see Press Release). This is significant for me as this is the first product to use the new version of the Oracle Utilities Application Framework, V4.0.1. This release is very significant as it adds a lot of new functionality to the framework, not just for MWM; it will be progressively rolled out across a few more Oracle Utilities products over the next 12 months. Watch the skies for more announcements. Now that Framework 4.0.1 has been released, I will be updating this blog on a regular basis, outlining significant features (there are over 60 features in the new Framework) for you to understand. It has been hard work, but it has finally been released and is used by the first product off the assembly line we call product development.

    Read the article

  • Apache cannot find mysql database modules

    - by user809857
    I've created a simple Django project and set up a MySQL database. My simple project just creates an entry in the database. The project works fine when I use the built-in development server provided by Django (runserver). But when I deployed the project on Apache and mod_wsgi (Ubuntu server), Django could not find 'books', which in this case is my table in the database. The MySQL database I use with runserver and with Apache is the same. I also rebuilt the database using Django's sqlall, validate, and syncdb, but I still get the error. What could be wrong with what I'm doing? Thanks
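
    One thing frequently worth checking in this kind of setup is that the WSGI process is pointed at the same project path and settings module that runserver uses; otherwise Django may silently fall back to a different (empty) database. A hedged sketch only - the paths and project name below are invented, and this mirrors the classic Django 1.3-era .wsgi file rather than the asker's actual one:

        import os
        import sys

        # Hypothetical locations - adjust to the real project layout
        sys.path.append('/srv/www')
        sys.path.append('/srv/www/myproject')

        # Use the same settings module (and therefore the same DATABASES entry) as runserver
        os.environ['DJANGO_SETTINGS_MODULE'] = 'myproject.settings'

        import django.core.handlers.wsgi
        application = django.core.handlers.wsgi.WSGIHandler()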

    Read the article

  • Web services, J2EE, Spring, DB integration project ideas - maybe data mining related?

    - by saral jain
    I am a graduate Computer Science student (Data Mining and Machine Learning) and have good exposure to core Java (3 years). I have read up on a bunch of stuff on the following topics: design patterns; J2EE web services (SOAP and REST); Spring and Hibernate; Java concurrency - advanced features like tasks and Executors. I would now like to do a project combining this stuff - over my free time, of course - to get a better understanding of these things and to build an end-to-end piece of software (and to learn best design principles, plus SVN and Maven). Any good project ideas would be really appreciated. I just want to build this stuff to learn, so I don't really mind re-inventing the wheel. Also, anything related to data mining would be an added bonus as it fits with my research, but it is absolutely not necessary, since this project is more about learning to do large-scale software development.

    Read the article

  • June 25 changes to BIS 742.15 How does it impact SSL iPhone App export compliance

    - by Rob
    This question isn't strictly development-related but I hope it's still acceptable :) On June 25, 2010 the BIS updated 742.15 and of interest to me is the new 742.14(b)(4) "Exclusions from mass market classification request, encryption registration and self-classification reporting requirements" and 742.15(b)(4)(ii) which states… (ii) Foreign products developed with or incorporating U.S.-origin encryption source code, components, or toolkits. Foreign products developed with or incorporating U.S. origin encryption source code, components or toolkits that are subject to the EAR, provided that the U.S. origin encryption items have previously been classified or registered and authorized by BIS and the cryptographic functionality has not been changed. Such products include foreign developed products that are designed to operate with U.S. products through a cryptographic interface. I take this to mean that my Canadian produced product that uses https is now excluded from requiring a CCATTS. What does everyone else think?

    Read the article

  • Mod Puts Mac OS 7 On the Nook Touch

    - by Jason Fitzpatrick
    Thanks to a mac-hardware emulator for Android, it’s now possible to run Mac OS 7 on the Nook Touch (or other Android-based tablet). If you’ve been looking for some retro-goodness to dump on your Nook or tablet–Oregon Trail anyone?–this simple hack will certainly help. Hit up the link below for additional screenshots and more information. Mini vMac for Android Development Thread [via MikeCanex]

    Read the article

  • Configuration data: single-row table vs. name-value-pair table

    - by Heinzi
    Let's say you write an application that can be configured by the user. For storing this "configuration data" into a database, two patterns are commonly used.

    The single-row table:

        CompanyName | StartFullScreen | RefreshSeconds | ...
        ------------+-----------------+----------------+-----
        ACME Inc.   | true            | 20             | ...

    The name-value-pair table:

        ConfigOption    | Value
        ----------------+------------------------
        CompanyName     | ACME Inc.
        StartFullScreen | true (or 1, or Y, ...)
        RefreshSeconds  | 20
        ...             | ...

    I've seen both options in the wild, and both have obvious advantages and disadvantages, for example: The single-row table limits the number of configuration options you can have (since the number of columns in a row is usually limited), and every additional configuration option requires a DB schema change. In a name-value-pair table everything is "stringly typed" (you have to encode/decode your Boolean/Date/etc. parameters). (many more) Is there some consensus within the development community about which option is preferable?

    Read the article

  • Table Variables: an empirical approach.

    - by Phil Factor
    It isn’t entirely a pleasant experience to publish an article only to have it described on Twitter as ‘Horrible’, and to have it criticized on the MVP forum. When this happened to me in the aftermath of publishing my article on Temporary tables recently, I was taken aback, because these critics were experts whose views I respect. What was my crime? It was, I think, to suggest that, despite the obvious quirks, it was best to use Table Variables as a first choice, and to use local Temporary Tables if you hit problems due to these quirks, or if you were doing complex joins using a large number of rows.

    What are these quirks? Well, table variables have advantages if they are used sensibly, but this requires some awareness by the developer about the potential hazards and how to avoid them. You can be hit by a badly-performing join involving a table variable. Table Variables are a compromise, and this compromise doesn’t always work out well. Explicit indexes aren’t allowed on Table Variables, so one cannot use covering indexes or non-unique indexes. The query optimizer has to make assumptions about the data rather than using column distribution statistics when a table variable is involved in a join, because there aren’t any column-based distribution statistics on a table variable. It assumes a reasonably even distribution of data, and is likely to have little idea of the number of rows in the table variables that are involved in queries. However complex the heuristics that are used might be in determining the best way of executing a SQL query, and they most certainly are, the Query Optimizer is likely to fail occasionally with table variables, under certain circumstances, and produce a Query Execution Plan that is frightful. The experienced developer or DBA will be on the lookout for this sort of problem. In this blog, I’ll be expanding on some of the tests I used when writing my article to illustrate the quirks, and include a subsequent example supplied by Kevin Boles.

    A simplified example.

    We’ll start out by illustrating a simple example that shows some of these characteristics. We’ll create two tables filled with random numbers and then see how many matches we get between the two tables. We’ll forget indexes altogether for this example, and use heaps. We’ll try the same join with two table variables, two table variables with OPTION (RECOMPILE) in the JOIN clause, and with two temporary tables. It is all a bit jerky because of the granularity of the timing, which isn’t actually happening at the millisecond level (I used DATETIME). However, you’ll see that the table variable is outperforming the local temporary table up to 10,000 rows. Actually, even without the use of the OPTION (RECOMPILE) hint, it is doing well.

    What happens when your table size increases? The table variable is, from around 30,000 rows, locked into a very bad execution plan unless you use OPTION (RECOMPILE) to provide the Query Analyser with a decent estimation of the size of the table. However, if it has the OPTION (RECOMPILE), then it is smokin’. Well, up to 120,000 rows, at least. It is performing better than a Temporary table, and in a good linear fashion.

    What about mixed table joins, where you are joining a temporary table to a table variable? You’d probably expect that the query analyzer would throw up its hands and produce a bad execution plan as if it were a table variable. After all, it knows nothing about the statistics in one of the tables, so how could it do any better? Well, it behaves as if it were doing a recompile. And an explicit recompile adds no value at all. (We just go up to 45,000 rows since we know the bigger picture now.)

    Now, if you were new to this, you might be tempted to start drawing conclusions. Beware! We’re dealing with a very complex beast: the Query Optimizer. It can come up with surprises. What if we change the query very slightly to insert the results into a Table Variable? We change nothing else and just measure the execution time of the statement as before. Suddenly, the table variable isn’t looking so much better, even taking into account the time involved in doing the table insert. OK, if you haven’t used OPTION (RECOMPILE) then you’re toast. Otherwise, there isn’t much in it between the Table variable and the temporary table. The table variable is faster up to 8,000 rows and then there is not much in it up to 100,000 rows. Past the 8,000 row mark, we’ve lost the advantage of the table variable’s speed. Any general rule you may be formulating has just gone for a walk. What we can conclude from this experiment is that if you join two table variables, and can’t use constraints, you’re going to need that OPTION (RECOMPILE) hint.

    Count Dracula and the Horror Join.

    These tables of integers provide a rather unreal example, so let’s try a rather different example, and get stuck into some implicit indexing, by using constraints. What unusual words are contained in the book ‘Dracula’ by Bram Stoker? Here we get a table of all the common words in the English language (60,387 of them) and put them in a table. We put them in a Table Variable with the word as a primary key, a Table Variable heap and a Table Variable with a primary key. We then take all the distinct words used in the book ‘Dracula’ (7,558 of them). We then create a table variable and insert into it all those uncommon words that are in ‘Dracula’, i.e. all the words in Dracula that aren’t matched in the list of common words. To do this we use a left outer join, where the right-hand value is null.

    The results show a huge variation, between the sublime and the gorblimey. If both tables contain a Primary Key on the columns we join on, and both are Table Variables, it took 33 ms. If one table contains a Primary Key, and the other is a heap, and both are Table Variables, it took 46 ms. If both Table Variables use a unique constraint, then the query takes 36 ms. If neither table contains a Primary Key and both are Table Variables, it took 116,383 ms. Yes, nearly two minutes!! If both tables contain a Primary Key, one is a Table Variable and the other is a temporary table, it took 113 ms. If one table contains a Primary Key, and both are Temporary Tables, it took 56 ms. If both tables are temporary tables and both have primary keys, it took 46 ms.

    Here we see table variables which are joined on their primary key again enjoying a slight performance advantage over temporary tables. Where both tables are table variables and both are heaps, the query suddenly takes nearly two minutes! So what if you have two heaps and you use OPTION (RECOMPILE)? If you take the rogue query and add the hint, then suddenly the query drops its time down to 76 ms. If you add unique indexes, then you've done even better, down to half that time. Here are the text execution plans. So where have we got to? Without drilling down into the minutiae of the execution plans, we can begin to create a hypothesis.

    If you are using table variables, and your tables are relatively small, they are faster than temporary tables, but as the number of rows increases you need to do one of two things: either you need to have a primary key on the column you are using to join on, or else you need to use OPTION (RECOMPILE). If you try to execute a query that is a join, and both tables are table variable heaps, you are asking for trouble (well, slow queries) unless you use the hint once the number of rows has risen past a point (30,000 in our first example, but this varies considerably according to context).

    Kevin’s Skew

    In describing the table size, I used the term ‘relatively small’. Kevin Boles produced an interesting case where a single-row table variable produces a very poor execution plan when joined to a very, very skewed table. In the original, pasted into my article as a comment, a column consisted of 100,000 rows in which the key column was one number (1). To this were added eight rows with sequential numbers up to 9. When this was joined to a single-row Table Variable with a key of 2, it produced a bad plan. This problem is unlikely to occur in real usage, and the Query Optimiser team probably never set up a test for it. Actually, the skew can be slightly less extreme than Kevin made it. The following test showed that once the table had 54 sequential rows, it adopted exactly the same execution plan as for the temporary table, and then all was well. Undeniably, real data does occasionally cause problems to the performance of joins in Table Variables due to the extreme skew of the distribution. We've all experienced Perfectly Poisonous Table Variables in real live data. As in Kevin’s example, indexes merely make matters worse, and the OPTION (RECOMPILE) trick does nothing to help. In this case, there is no option but to use a temporary table. However, one has to note that once the slight de-skew had taken place, the plans were identical across a huge range.

    Conclusions

    Where you need to hold intermediate results as part of a process, Table Variables offer a good alternative to temporary tables when used wisely. They can perform faster than a temporary table when the number of rows is not great. For some processing with huge tables, they can perform well when only a clustered index is required, and when the nature of the processing makes an index seek very effective. Table Variables are scoped to the batch or procedure and are unlikely to hang about in TempDB when they are no longer required. They require no explicit cleanup. Where the number of rows in the table is moderate, you can even use them in joins as ‘heaps’, unindexed. Beware, however, since, as the number of rows increases, joins on Table Variable heaps can easily become saddled with very poor execution plans, and this must be cured either by adding constraints (UNIQUE or PRIMARY KEY) or by adding the OPTION (RECOMPILE) hint if this is impossible. Occasionally, the way that the data is distributed prevents the efficient use of Table Variables, and this will require using a temporary table instead.

    Table Variables require some awareness by the developer about the potential hazards and how to avoid them. If you are not prepared to do any performance monitoring of your code or fine-tuning, and just want to pummel out stuff that ‘just runs’ without considering namby-pamby stuff such as indexes, then stick to Temporary tables. If you are likely to slosh about large numbers of rows in temporary tables without considering the niceties of processing just what is required and no more, then temporary tables provide a safer and less fragile means-to-an-end for you.
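
    As a minimal T-SQL sketch of the pattern described above - not the author's original test harness, and with invented table and column names - here is a join between two table variables that have primary keys, with the OPTION (RECOMPILE) hint applied:

        -- Two table variables; the PRIMARY KEY constraints give the optimizer implicit unique indexes
        DECLARE @FirstSample  TABLE (TheNumber INT PRIMARY KEY);
        DECLARE @SecondSample TABLE (TheNumber INT PRIMARY KEY);

        -- (populate both tables with random integers here)

        -- Count the matches; OPTION (RECOMPILE) lets the optimizer see the
        -- actual row counts in the table variables at execution time
        SELECT COUNT(*)
        FROM @FirstSample AS f
        JOIN @SecondSample AS s
          ON f.TheNumber = s.TheNumber
        OPTION (RECOMPILE);

    Without the constraints or the hint, the same join over two table variable heaps is the shape that the article shows degrading badly as the row counts grow.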

    Read the article

  • OTP Bank uses Oracle Warehouse Builder

    - by Fekete Zoltán
    The following document has just appeared among the customer success stories on Oracle.com: "OTP Bank Data Warehouse Development Team Improves Service Level and Lowers Reporting Lead Time for Business Fields by 80%" - in other words, OTP Bank uses the Oracle Warehouse Builder ETL/ELT tool for its data warehouse development. The OTP Bank Transactional Data Warehouse development team has raised the quality of the services it provides to its internal customers; one result is an 80% reduction in the lead time of cross-business-line reporting processes. The Hungarian-language success story can be downloaded from here. The most important results with OWB: improved data quality achieved through the standardization of ETL processes (OWB); more effective cooperation between the business areas and IT development with Oracle Business Intelligence EE; standardized ETL and reporting processes, meaning deliberate business metadata management driven by fixed report sets, unified terminology, precise knowledge of complex banking processes for both the business areas and IT developers, and effective cooperation across the bank; a shorter end-to-end process from request to data publication; and the preparation of ad-hoc reports dropped by 80%, from the previous 1.5 weeks to an average of 2 working days.

    Read the article

  • New to Java and Spring. What are some good design principles for an inexperienced java developer like me?

    - by Imtiaz Ahmad
    I am learning Java and have written a few small, useful programs. I am new to Spring but have managed to understand the concept of dependency injection for decoupling, and I'm trying to apply that in my development work in an enterprise setting. What are the 3 most important design patterns I should master (not for interview purposes, but ones that I will use every day as a good Java developer)? Also, what are some good design considerations and coding practices specific to Java? My goal is to write well-decoupled, coherent programs that are easy to maintain and that don't make me stand out as a Java rookie. Stuff like not beginning my package names with com. has already made me precariously visible in my team. But they know I have 2 years of coding experience, just not in Java.

    Read the article

  • How to Install & Use the Window Maker Desktop Environment on Ubuntu

    - by Chris Hoffman
    Window Maker is a Linux desktop environment designed to emulate NeXTSTEP, which eventually evolved into Mac OS X. With its focus on emulating NeXTSTEP, it eschews the task bars and application menu buttons found in many other lightweight desktop environments. Window Maker is now under active development again after seven years without an official release. A lot has changed on the Linux desktop front since Window Maker was last being actively developed, but Window Maker still provides a unique, minimal environment – for users looking for that sort of thing.

    Read the article

  • TechEd Israel 2010 may only accept speakers from sponsors

    A month or so ago, Microsoft Israel started sending out emails to its partners and registered event users to "Save the date! Microsoft TechEd Israel is coming, and it's going to be this November!" Great news, I thought to myself. I'd been to a couple of the MS TechEd events, as a speaker and as an attendee, and it was lovely and professionally done. Israel is an amazing place for technology and development, and TechEd hosted some big names in the world of MS software. A couple of weeks ago, I was shocked...

    Read the article
