Search Results

Search found 20761 results on 831 pages for 'chef client'.


  • Erlang web frameworks survey

    - by Zachary K
    (Inspired by a similar question about Haskell.) There are several web frameworks for Erlang, such as Nitrogen, Chicago Boss, Zotonic, and a few more. In what aspects do they differ from each other? For example:
    - features (e.g. server only, or also client scripting; easy support for different kinds of database)
    - maturity (e.g. stability, documentation quality)
    - scalability (e.g. performance, handy abstractions)
    - main targets
    Also, what are examples of real-world sites / web apps using these frameworks? EDIT: Starting a bounty in hopes that it will get some conversation going.

    Read the article

  • An alternative way to request read receipts

    - by lavanyadeepak
    Sometime or other we use messaging namespaces like System.Net.Mail or System.Web.Mail to send emails from our applications. When we need to include headers to request delivery or return receipts (often called Message Disposition Notifications), we are limited by the fact that not all email servers and email clients honor them. We can push this boundary a little, thanks to a technique I discovered on Gawab. It embeds a small invisible 1x1 image whose source reads as recieptimg.php?id=2323425324. When this image is requested by the web browser or email client, the server-side handler maps the ID back to the message to record that it was read. These are commonly called 'web bugs'. It is not a foolproof solution, though: spammers misuse the same technique to confirm that an email address is active, and most email clients suppress inline images for exactly that reason. I thought I would share this observation anyway for the benefit of others.
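    As a rough sketch of how the server side of such a web bug might look (in Python rather than PHP, with a hypothetical /receipt.gif path standing in for the recieptimg.php handler mentioned above), the handler only has to return a 1x1 image and record which message ID requested it:

        # A minimal sketch of the "web bug" idea described above, using only the
        # Python standard library. The /receipt.gif path and the in-memory set are
        # hypothetical stand-ins for the real handler and its database.
        import base64
        from http.server import BaseHTTPRequestHandler, HTTPServer
        from urllib.parse import urlparse, parse_qs

        # 1x1 transparent GIF, so the "image" renders invisibly in the email body.
        PIXEL = base64.b64decode(
            b"R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7"
        )
        read_receipts = set()  # stand-in for a real database table

        class ReceiptHandler(BaseHTTPRequestHandler):
            def do_GET(self):
                url = urlparse(self.path)
                if url.path == "/receipt.gif":
                    msg_id = parse_qs(url.query).get("id", ["?"])[0]
                    read_receipts.add(msg_id)  # record that message msg_id was opened
                    self.send_response(200)
                    self.send_header("Content-Type", "image/gif")
                    self.send_header("Content-Length", str(len(PIXEL)))
                    self.end_headers()
                    self.wfile.write(PIXEL)
                else:
                    self.send_error(404)

        if __name__ == "__main__":
            HTTPServer(("", 8080), ReceiptHandler).serve_forever()

    The email body would then embed something like <img src="http://example.com:8080/receipt.gif?id=2323425324" width="1" height="1">, subject to the caveats about suppressed inline images noted above.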

    Read the article

  • Internal and external API architecture

    - by Tacomanator
    The company I work for maintains a successful SaaS product that grew "organically" over the years. We are planning to expand the line with a suite of new products that will share data with the existing product. To support this, we are looking to consolidate business logic into a single place: a web service layer. The WS layer will be used by:
    - the web applications
    - a tool to import data
    - a tool to integrate with other client software (not an API per se)
    We also want to create an API that can be used by those of our customers who are capable of using it to create their own integrations. We are struggling with the following question: should the internal API (aka the WS layer) and the external API be one and the same, with security and permission settings to control what can be done by whom, or should they be two separate applications, where the external API just calls the internal API like any other application? So far in our debate it seems that separating them may be more secure, but will add overhead. What have others done in a similar situation?
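    For what it's worth, here is a minimal sketch of the second option being debated: the external API as a thin, separately deployed layer that only authenticates, authorizes and then delegates to the internal WS layer. All names here (API_KEYS, internal_get_orders, the orders:read scope) are hypothetical, and transport details are ignored:

        from dataclasses import dataclass

        @dataclass
        class Caller:
            customer_id: str
            scopes: frozenset

        API_KEYS = {"key-123": Caller("cust-42", frozenset({"orders:read"}))}

        def internal_get_orders(customer_id: str) -> list[dict]:
            # Stand-in for a call into the internal WS layer (over HTTP or in-process).
            return [{"id": 1, "customer": customer_id}]

        def external_get_orders(api_key: str) -> list[dict]:
            caller = API_KEYS.get(api_key)
            if caller is None or "orders:read" not in caller.scopes:
                raise PermissionError("not allowed")
            # The external layer never re-implements business logic; it only narrows
            # what the caller may ask for, then delegates to the internal API.
            return internal_get_orders(caller.customer_id)

        print(external_get_orders("key-123"))

    The overhead mentioned in the question is the extra hop and the duplicated endpoint surface; the security argument is that the external layer can expose a much narrower contract than the internal one.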

    Read the article

  • Register now for the FREE Tech Days Online Conference January 20th

    - by Eric Nelson
    The perfect solution to the "January blues" is a good solid few hours learning about great technology. The 'Build an app for that' Online Conference is exactly that, featuring demo-rich sessions on building applications for the browser, Windows 7, and Windows Phone 7. There are three tracks letting you choose which sessions are most relevant to you - whether you're just considering client development with Silverlight, or you've already got stuck into an advanced project. We'll also explore new form factors such as Phone and Slate, and how to develop touch-based applications. Finally we'll cover the important subject of how to create beautifully designed user interfaces. Register now. Agenda:

    Read the article

  • How to connect to a WCF service using IP of the host machine where the service is hosted?

    - by Kumar
    I have a secured WCF service (https://<MachineName>:sslport/services) self-hosted on a machine. Different instances of the same service are deployed on different machines. From a client app I am able to connect to these services through code, i.e. using ChannelFactory() with the same endpoint address. But if I try to access the service using the endpoint address https://<ipAddress>:sslport/services, replacing the machine name with the machine's IP address, I get an error stating "could not establish trust relationship". I know this error is caused by the SSL certificate, for which a trust relationship could not be established. Are there any settings or any possibilities to make this work?
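    This isn't a WCF-specific fix, but a small Python sketch (with placeholder host names and an assumed port) can illustrate the underlying cause: the certificate names the machine, so hostname verification against the bare IP fails unless that IP is also listed in the certificate's subjectAltName.

        # Illustration only: the host names, IP and port below are placeholders.
        import socket, ssl

        def try_tls(server_name: str, connect_addr: str, port: int = 443) -> None:
            ctx = ssl.create_default_context()
            try:
                with socket.create_connection((connect_addr, port), timeout=5) as sock:
                    with ctx.wrap_socket(sock, server_hostname=server_name):
                        print(f"handshake OK when verifying against {server_name!r}")
            except ssl.SSLCertVerificationError as exc:
                # The Python analogue of WCF's "could not establish trust relationship".
                print(f"certificate rejected for {server_name!r}: {exc}")

        # Verifying against the machine name succeeds; verifying against the bare IP
        # fails unless the IP appears as a subjectAltName in the certificate.
        try_tls("machine.example.com", "machine.example.com")
        try_tls("203.0.113.10", "203.0.113.10")

    The usual remedies are therefore either issuing a certificate whose subjectAltName also lists the IP address, or relaxing/overriding certificate validation on the client side.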

    Read the article

  • Ubuntu One under KDE

    - by Andy Goss
    Currently, while Ubuntu One can be installed under KDE, Dolphin does not know about it, so you cannot set folders other than the Ubuntu One folder to sync. There is a command line utility, "u1sdtool", which performs most if not all of the functions you need. Setting the keyring password to be the same as your login won't prevent Ubuntu One from asking you for it again on login. If you want to change it, install Seahorse, look for it in your menus as "Passwords and Keys", right click on "Passwords: default", then select "Change Password". The various deleting tricks I've read about don't work for me. There is an ubuntuone-client-kde being developed, but no clue as to when it will appear in a stable repo. Any further advice will be welcomed by me and doubtless a few other KDE users.

    Read the article

  • Group "Discussion" software?

    - by Kayle
    My client wants a "lite" forum... not unlike these Stack Exchange sites, but even a little lighter. There's a screenshot of the discussion group she likes most below. You can also go here to see it for yourself if you like. I don't think traditional forum apps will display, functionally, in this manner. Is there any software I can use to get a similar result? A web service would be acceptable as well.

    Read the article

  • [News] A TFS plugin for Eclipse

    Following the recent acquisition of Teamprise in November, Microsoft has announced the availability of a TFS plugin for Eclipse named Visual Studio Team Explorer 2010: "The beta release contains what we consider to be the essential features necessary to claim that we're a client for TFS 2010. We've been trying to strike a balance between including 2010 features, and getting the product to market, so you won't see everything here yet (...)". Screenshots of this plugin, which appears to be free, are available on the blog in question.

    Read the article

  • What's up with all the updates? [closed]

    - by Bob Babb
    I use Ubuntu exclusively for my job, especially for the fact that everything works and I get the most out of my processor and memory, but you are killing me with updates! I just lost a very good opportunity from a client that installed Ubuntu but got tired of all the updates. I really can't argue the fact. In a matter of a day I had two software updates. Quote from customer: "It's sad that I come in at 6:00 in the morning to install updates from a LTS version, and then before I leave at the end of the day I have 19 new updates to install. At least Microsoft bundles them in controllable groups." Sadly I have to agree, guys you have to do something about this. Please!

    Read the article

  • How do I control remote Windows 7 machine? VNC/RDP/?

    - by artfulrobot
    I have a Windows 7 laptop that's going out and about, and I'd like to be able to administer it from my Ubuntu (precise) desktop. While the laptop is in my office I can use GNOME's "Remote Desktop Viewer" (Vinagre) to connect via RDP. However, when it's in the office I don't really need RDP, of course! And when it's out of the office I cannot control the routers/firewalls along the way, so I won't know the IP and cannot set up port forwards. I'm looking for an Ubuntu-compatible way to achieve this. NB: I can set up port forwards at my office end, so a service on the laptop could try a "backwards" connection to my client.

    Read the article

  • Installing software on an offline Ubuntu server

    - by Muhammad Gelbana
    Assume that I have a server with Ubuntu Server newly installed on it. I was thinking of installing the very same version in VirtualBox (or any other virtualization software), connecting it to the internet, and using apt-get to only download the packages for upgrading the system and the new software (tomcat7, openjdk6-default-headless, etc.). Then I would copy the downloaded packages from the archive folder to the offline server's archive folder via a USB stick. So the virtual system won't actually be upgraded nor have any new software installed. But would running the very same apt-get commands on the offline system, without the download-only flag -d, execute without issues?
    EDIT: This needs to be as simple as possible, because I'll have to write a guide for our client to do this on his own, so it won't be acceptable to require deep Linux knowledge.

    Read the article

  • Why everybody should do Sales!

    - by FelixWehmeyer
    I speak with many business students and ask them what job they want to get into. Most of them tell me they want a job in Marketing, Management Consulting or Finance. I hardly ever hear "Sales, that is what I want to do", and I often wonder why. I would like to start with a quote from Zig Ziglar, a successful salesman: "Nothing happens until someone sells something." But to get back to the main point, why wouldn't you want to get into sales? When people think of sales, they picture a typical salesman in their head and think that selling is scary and all about manipulating, pressuring and pushing someone into buying something they don't need. Are these stereotypes accurate? I don't believe so. So why should you want to be in sales? If you think about selling as providing the solution for a problem and talking about the benefits of making a decision, then every job in this world involves some selling. In every job you deal with coworkers whom you want to convince of your ideas, or with convincing your boss that the project you want to work on is good for the company. These days, consumers and businesses are very well informed about services and products. When we are talking about highly complex products, such as IT solutions, businesses don't accept your run-of-the-mill salesman who is pushing a sale. These are often long projects where salespeople have a consulting and leading role. Salespeople need to be able to consult companies and customers on their problem and convince a client that their solution is the best fit. Beyond the fact that sales is by far not as scary and shady as you thought, there are a few points that will make you want to consider a sales career:
    - Negotiating skills – When you are in sales you will learn how to negotiate. Salespeople learn to listen to their customers and try to make them happy, overcoming objections and coming to a final agreement that both parties are happy with.
    - Persistence/Challenge – As a salesperson you will often hear a negative answer; in a sales role you will start to embrace this and see a 'no' as a challenge, not as a rejection. This attitude change can help you a lot in your career, but also in your personal life. You will become more optimistic and gain a go-getter attitude.
    - Salary – As salespeople are seen as the moneymakers for the company, companies often reward their sales teams generously. Most likely in a sales role you will receive a good basic salary, and often you get nice bonuses on top of that based on your performance. Oracle is, for instance, the company that offers the highest average commission in the world. Further, you can expect many other benefits, as companies know that there is a high demand for good salespeople.
    - Teamwork – Sales is a lot like having your own business: you are responsible for your own territory or set of clients, and you are the one who is responsible for the revenue coming from that territory. So in order to generate that revenue you will have to work together with many departments and people. Every (potential) client could be seen as a different project, and you are the project leader.
    - Understanding customers and the business – Of any job you could choose, sales will give you the most insight into the market. Salespeople are usually well connected, talk with different customers, learn about the market and are up to date on all the latest changes. Even if you want to change to a different role in the long run, you have a great head start, as you understand the market and customers like no one else.
    - Job security – Look at all the job postings out there: many of them are sales-related. So if you want a steady job, plenty of choice and companies willing to invest in you, sales could be something for you.
    Are you interested in exploring a sales career? At Oracle we are always looking for good sales professionals and fresh graduates who want to get into sales! For many languages, such as Flemish, Dutch, German, French, Swedish and Norwegian (and more), we are currently looking for graduates who want to develop their career at Oracle. Please have a look at this article for the experience of a Business Development Consultant at Oracle in Dublin. Want to learn more about this job? Check out this link or send an email to jessica.ebbelaar-at-oracle.com! Have a look at our website http://campus.oracle.com for all of our other latest sales and non-sales vacancies!

    Read the article

  • ASP.NET MVC 2 Released!

    - by kaleidoscope
    ASP.NET MVC 2 has been released! ASP.NET MVC 2 adds a bunch of new capabilities and features, including:
    - New Strongly Typed HTML Helpers
    - Enhanced Model Validation support across both server and client
    - Auto-Scaffold UI Helpers with Template Customization
    - Support for splitting up large applications into 'Areas'
    - Asynchronous Controllers support that enables long running tasks in parallel
    - Support for rendering sub-sections of a page/site using Html.RenderAction
    - Lots of new helper functions, utilities, and API enhancements
    - Improved Visual Studio tooling support
    More details can be found at http://www.azurejournal.com/2010/03/aspnet-MVC-2-released/ and http://www.asp.net/mvc/
    Anish, S

    Read the article

  • Webcor Builders Coordinates Construction Schedules and Mitigates Potential Delays More Efficiently with Integrated Project Management

    - by Sylvie MacKenzie, PMP
    With more than 40 years of commercial construction experience, Webcor Builders is a leading builder of distinguished, high-profile projects, including high-rise condominiums and hotels, laboratories, healthcare centers, and public works projects. Webcor is also known for its award-winning concrete, interior construction, historic restoration, and seismic renovation work. The company has completed more than 50 million square feet of projects to date. Considering the variety and complexity of the construction projects Webcor undertakes, an integrated project management solution is critical to ensuring optimal efficiency and completing client projects on time and on budget.
    The company previously used a number of scheduling systems for its various building projects. These packages provided different levels of schedule detail and required schedulers, engineers, and other employees to learn multiple systems. From an IT cost and complexity perspective, the company had to manage multiple scheduling systems and pay for multiple sets of licenses. The company looked to standardize on an enterprise project management system, and selected Oracle's Primavera P6 Enterprise Project Portfolio Management. Webcor uses the solution's advanced capabilities to schedule complex projects, analyze delays, model and propose multiple scenarios to demonstrate and mitigate delays and cost overruns, and process that information efficiently to deliver the scheduling precision that public and private projects require. In fact, the solution was instrumental in helping the company's expansion into public sector projects during the recent economic downturn, and with Primavera P6 in place, the company can deliver the precise scheduling and milestone reporting required for large public projects.
    The solution is also being used to manage the high-profile University of California, Berkeley Memorial Stadium project. Webcor was hired as construction manager and general contractor for the stadium renovation project, which is a fast-paced project located near the seismically active Hayward Fault Zone. Due to the University of California's football schedule, meeting the University's deadline for the coming season placed Webcor in a situation where risk awareness and early warnings of issues would be paramount. Webcor and the extended project team needed a solution that could instantly analyze alternate scenarios to mitigate potential delays; Primavera would deliver those answers. The team would also need to enable multiple stakeholders to use an internet-based platform to access the schedule from various locations, and model complicated sequencing requirements where swift decisions would be made to keep the project on track.
    The schedule is an integral part of Webcor's construction management process for the stadium project. Rather than providing the client with the industry-standard monthly update, Webcor updates the critical path method (CPM) schedule on a weekly basis. The project team also reviews the schedule and updates it weekly to confirm that progress and forecasted performance are accurate. Hired by the University for its ability to deliver in high-risk environments, the Webcor team was recently hit with a design supplement that could have added up to 70 days to the project. Using Oracle Primavera P6, the team sprang into action analyzing multiple "what if" scenarios to review mitigation means and methods. Determined to make sure the Bears could take the field in the coming season, the project team nearly eliminated the impact with their creative analysis in working the schedule. The total time from the issuance of the final design supplement to an agreed mitigation response was less than one week; leveraging the Oracle Primavera solution, Webcor was able to deliver superior customer value.
    With the ability to efficiently manage projects and schedules, Webcor can ensure it completes its projects on time and on budget, as well as inform clients about what changes to plans will mean in terms of delays and additional costs. Read the complete customer case study at: http://www.oracle.com/us/corporate/customers/customersearch/webcor-builders-1-primavera-ss-1639886.html

    Read the article

  • How to structure a set of RESTful URLs

    - by meetamit
    Kind of a REST lightweight here... Wondering which URL scheme is more appropriate for a stock market data app (BTW, all queries will be GETs as the client doesn't modify data):
    Scheme 1 examples:
    /stocks/ABC/news
    /indexes/XYZ/news
    /stocks/ABC/time_series/daily
    /stocks/ABC/time_series/weekly
    /groups/ABC/time_series/daily
    /groups/ABC/time_series/weekly
    Scheme 2 examples:
    /news/stock/ABC
    /news/index/XYZ
    /time_series/stock/ABC/daily
    /time_series/stock/ABC/weekly
    /time_series/index/XYZ/daily
    /time_series/index/XYZ/weekly
    Scheme 3 examples:
    /news/stock/ABC
    /news/index/XYZ
    /time_series/daily/stock/ABC
    /time_series/weekly/stock/ABC
    /time_series/daily/index/XYZ
    /time_series/weekly/index/XYZ
    Scheme 4: Something else???
    The point is that for any data being requested, the URL needs to encapsulate whether an item is a Stock or an Index. And, with all the RESTful talk about resources, I'm confused about whether my primary resource is the stock & index or the time_series & news. Sorry if this is a silly question :/ Thanks!
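    As a small illustration of Scheme 1 (not an answer to the resource-modelling question), here is a hypothetical route-matching sketch in Python showing that the stock/index distinction can live in the leading collection segment while the rest of the pattern is shared:

        # Hypothetical helper, not tied to any framework: Scheme 1-style paths keep the
        # stock/index distinction as the leading collection segment, so one pattern
        # serves both kinds of instrument.
        import re

        ROUTES = [
            (re.compile(r"^/(stocks|indexes)/(?P<symbol>\w+)/news$"), "news"),
            (re.compile(r"^/(stocks|indexes)/(?P<symbol>\w+)/time_series/(?P<period>daily|weekly)$"),
             "time_series"),
        ]

        def dispatch(path: str) -> dict:
            for pattern, resource in ROUTES:
                m = pattern.match(path)
                if m:
                    kind = "stock" if m.group(1) == "stocks" else "index"
                    return {"resource": resource, "kind": kind, **m.groupdict()}
            raise LookupError(f"no route for {path}")

        print(dispatch("/stocks/ABC/time_series/daily"))
        print(dispatch("/indexes/XYZ/news"))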

    Read the article

  • My error with upgrading 4.0 to 4.2- What NOT to do...

    - by Steve Tunstall
    Last week, I was helping a client upgrade from the 2011.1.4.0 code to the newest 2011.1.4.2 code. We downloaded the 4.2 update from MOS, uploaded and unpacked it on both controllers, and upgraded one of the controllers in the cluster with no issues at all. As this was a brand-new system with no networking or pools made on it yet, there were not any resources to fail back and forth between the controllers. Each controller had its own, private, management interface (igb0 and igb1) and that's it. So we took controller 1 as the passive controller and upgraded it first. The first controller came back up with no issues and was now on the 4.2 code. Great. We then did a takeover on controller 1, making it the active head (although there were no resources for it to take), and then proceeded to upgrade controller 2. Upon upgrading the second controller, we ran the health check with no issues. We then ran the update and it ran and rebooted normally.
    However, something strange then happened. It took longer than normal to come back up, and when it did, we got the "cluster controllers on different code" error message that one gets when the two controllers of a cluster are running different code. But we had just upgraded the second controller to 4.2, so they should have been the same, right??? Going into the Maintenance-->System screen of controller 2, we saw something very strange. The "current version" was still on 4.0, and the 4.2 code was there but was in the "previous" state with the rollback icon, as if it was the OLDER code and not the newer code. I have never seen this happen before. I would have thought it was a bad 4.2 code file, but it worked just fine with controller 1, so I don't think that was it. Other than the fact the code did not update, there was nothing else going on with this system. It had no yellow lights, no errors in the Problems section, and no errors in any of the logs. It was just out of the box a few hours ago, and didn't even have a storage pool yet.
    So.... We deleted the 4.2 code, uploaded it from scratch, ran the health check, and ran the upgrade again. Once again, it seemed to go great, rebooted, and came back up to the same issue, where it came to 4.0 instead of 4.2. See the picture below....
    HERE IS WHERE I MADE A BIG MISTAKE.... I SHOULD have instantly called support and opened a Sev 2 ticket. They could have done a shared shell and gotten the correct Fishworks engineer to look at the files and the code and determine what file was messed up and fix it. The system was up and working just fine; it was just on an older code version, not really a huge problem at all. Instead, I went ahead and clicked the "Rollback" icon, thinking that the system would roll back to the 4.2 code.
    Ouch... What happened was that the system said, "Fine, I will delete the 4.0 code and boot to your 4.2 code"... Which was stupid on my part, because something was wrong with the 4.2 code file here and the 4.0 was just fine. So now the system could not boot at all, the 4.0 code was completely missing from the system, and even a high-level Fishworks engineer could not help us. I had messed it up good. We could only get to the ILOM, and I had to re-image the system from scratch using a hard-to-get-and-use FishStick USB drive. These are tightly controlled and difficult to get, almost always handcuffed to an engineer who will drive out to re-image a system. This took another day of my client's time. So....
    If you see a "previous version" of your system code which is actually a version higher than the current version... DO NOT ROLL IT BACK.... It did not upgrade for a very good reason. In my case, after the system was re-imaged to a code level just three back, we once again tried the same 4.2 code update and it worked perfectly the first time, and the system is now great and stable. Lesson learned.
    By the way, our buddy Ryan Matthews wanted to point out the best practice and supported way of performing an upgrade of an active/active ZFSSA, where both controllers are doing some of the work. These steps would not have helped me for the above issue, but it's important to follow the correct procedure when doing an upgrade:
    1) Upload software to both controllers and wait for it to unpack.
    2) On controller "A" navigate to configuration/cluster and click "takeover".
    3) Wait for controller "B" to finish restarting, then login to it, navigate to maintenance/system, and roll forward to the new software.
    4) Wait for controller "B" to apply the update and finish rebooting.
    5) Login to controller "B", navigate to configuration/cluster and click "takeover".
    6) Wait for controller "A" to finish restarting, then login to it, navigate to maintenance/system, and roll forward to the new software.
    7) Wait for controller "A" to apply the update and finish rebooting.
    8) Login to controller "B", navigate to configuration/cluster and click "failback".

    Read the article

  • .pam_environment in kerberized nfs4 home directory

    - by Paul Stoever
    How can I get pam_env to read the user's .pam_environment file if the user's file is located in a kerberized NFS4 mount? The file and directory permissions for the .pam_environment file are set in a way that allows the local root to read the file. Reading .pam_environment only fails on the first login; subsequent logins successfully read the file. The client uses Ubuntu 12.04 Desktop, the NFS/Kerberos server is 12.04 Server. The Kerberos/NFS4 stuff works with the exception of this. From /var/log/auth for the first login:
    ...
    lightdm: pam_krb5(lightdm:auth): user USERNAME authenticated as USERNAME@REALM
    lightdm: pam_unix(lightdm:session): session closed for user lightdm
    lightdm: pam_env(lightdm:setcred): Unable to open config file: USERHOME/.pam_environment: Permission denied
    lightdm: pam_env(lightdm:setcred): Unable to open config file: USERHOME/.pam_environment: Permission denied
    lightdm: pam_unix(lightdm:session): session opened for user USERNAME by (uid=0)
    ...

    Read the article

  • Oracle Fusion CRM and Lotus Notes Integration by iEnterprises

    - by Richard Lefebvre
    Integrate Oracle Fusion CRM and Lotus Notes in one easy step with nothing to install other than a 'plugin' for your Lotus Notes client. The Lotus Notes Connector for Oracle Fusion CRM developed by iEnterprises is an easy to use tool that allows you to instantly synchronize your Lotus Notes email, calendar, ToDos and PAB (Personal Address Book) to and from your Oracle Fusion CRM system. It removes the need for time consuming copy and paste between these two systems. For more information, a solution data-sheet and/or to request a trial please visit .... http://www.ienterprises.com/products/lotus-notes-connector/connector-for-fusion.html or contact Matt Hatherley ([email protected])

    Read the article

  • Fastest way to run a JSON server on my local machine

    - by Mohsen
    I am a front-end developer. For many experiments I do, I need to have a server that talks JSON with my client-side app. Normally that server is a simple server that responds to my POSTs and GETs. For example, I need to set up a server that saves, modifies and reads data from a "library" database like this:
    POST /books    creates a book
    GET /book/:id  gets a book
    and so on... What is the fastest and easiest technology stack for database and server in this case? I am open to using Ruby, Node.js and anything that does the job fast and easy. Is there any framework (in any language) that does stuff like this for me?
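    As one possible quick stack (assuming Python is acceptable), the standard library alone is enough for a throwaway JSON server with an in-memory dict as the database; the sketch below mirrors the POST /books and GET /book/:id routes from the question:

        # Throwaway JSON server: Python standard library only, in-memory "database".
        # Persistence, validation and error handling are deliberately omitted.
        import json
        from http.server import BaseHTTPRequestHandler, HTTPServer

        BOOKS: dict[int, dict] = {}

        class LibraryHandler(BaseHTTPRequestHandler):
            def _send_json(self, status: int, payload: dict) -> None:
                body = json.dumps(payload).encode()
                self.send_response(status)
                self.send_header("Content-Type", "application/json")
                self.send_header("Content-Length", str(len(body)))
                self.end_headers()
                self.wfile.write(body)

            def do_POST(self):  # POST /books -> create a book
                if self.path != "/books":
                    return self._send_json(404, {"error": "not found"})
                length = int(self.headers.get("Content-Length", 0))
                book = json.loads(self.rfile.read(length) or b"{}")
                book_id = len(BOOKS) + 1
                BOOKS[book_id] = book
                self._send_json(201, {"id": book_id, **book})

            def do_GET(self):   # GET /book/<id> -> fetch a book
                parts = self.path.strip("/").split("/")
                if len(parts) == 2 and parts[0] == "book" and parts[1].isdigit():
                    book = BOOKS.get(int(parts[1]))
                    if book is not None:
                        return self._send_json(200, {"id": int(parts[1]), **book})
                self._send_json(404, {"error": "not found"})

        if __name__ == "__main__":
            HTTPServer(("", 8000), LibraryHandler).serve_forever()

    Once it is running, curl -X POST localhost:8000/books -d '{"title": "Dune"}' followed by curl localhost:8000/book/1 exercises both routes. Anything persistent or multi-user would of course call for a real framework and database.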

    Read the article

  • Forward .html/.htm to .php with .config

    - by PhilipK
    I'm moving a site from my Linux-hosted server to a client's Windows-hosted server. The .htaccess file no longer works, and I'm told that Windows servers use .config. How can I forward all users accessing .html & .htm files to the equivalent .php file?
    Server info:
    - OS/Hosting Type: Windows / Shared Hosting
    - .NET Runtime Version: ASP.NET 2.0/3.0/3.5
    - PHP Version: PHP 5.2
    - IIS Version: IIS 7.0
    - Data Center: US Regional
    EDIT: Hosting is provided by GoDaddy. I was told by a friend that the following should work, but it has no effect on the site:
    <configuration>
      <system.webServer>
        <handlers>
          <add name="PHP-FastCGI" verb="*" path="*.html" modules="FastCgiModule" scriptProcessor="c:\php\php-cgi.exe" resourceType="Either" />
        </handlers>
      </system.webServer>
    </configuration>

    Read the article

  • how does server communication work in a flash game with a php backend

    - by Tim Rogers
    I am trying to create a browser game using ActionScript/Flash. Currently, I'm trying to understand how I would go about creating a back-end that interfaces with my MySQL database. As far as I understand, if I create a PHP file on a webserver called test.php and then navigate to a webpage hosted on the server, e.g. www.example.com/test, the PHP script will run and display the result in my browser. This would use HTTP. Is this how communication between client and server usually works in a Flash game? For example, if the game needed to query the DB, would ActionScript have to essentially invoke the URL of the PHP script that would execute the query? It could then parse the data and use it. If this is the case, then is JSON considered a good way to transfer data over HTTP?
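    To illustrate the round trip being described (client invokes a URL, the server-side script queries the database and returns JSON, the client parses the reply), here is a small sketch written in Python purely as a stand-in for the ActionScript client; the player.php endpoint and its response shape are hypothetical:

        # Stand-in for the game client: issue an HTTP GET to the server-side script,
        # then parse the JSON body it returns. Endpoint and fields are made up.
        import json
        from urllib import request

        def fetch_player(player_id: int) -> dict:
            url = f"http://www.example.com/player.php?id={player_id}"
            with request.urlopen(url, timeout=5) as resp:       # HTTP GET to the PHP script
                return json.loads(resp.read().decode("utf-8"))  # parse the JSON reply

        # e.g. the script might return {"id": 7, "name": "tim", "score": 1200}
        player = fetch_player(7)
        print(player["name"], player["score"])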

    Read the article

  • Issues with the intended behavior of a Service layer?

    - by Rafael Cichocki
    This analysis makes sense, and states that anything that avoids code duplication and simplifies maintenance speaks for a service layer. What is the technical behavior? When a service client references a service, does it do so at runtime, or does it happen at compile time? When I change something in the service layer code, will this change be automatically taken into account in all its clients, or do they need to be individually recompiled? How does this make sense from a testing point of view - I have working code based on some code from a service, but if that service changes, my code might break?!

    Read the article

  • PanelGridLayout - A Layout Revolution

    - by Duncan Mills
    With the most recent 11.1.2 patchset (11.1.2.3) there has been a lot of excitement around ADF Essentials (and rightly so); however, in all the fuss I didn't want an even more significant change to get missed - yes, you read that correctly, a more significant change! I'm talking about the new panelGridLayout component. I can confidently say that this is one of the most revolutionary components that we've introduced in 11g, even though it sounds rather boring. To be totally accurate, panelGrid was introduced in 11.1.2.2 but without any presence in the component palette or other design time support, so it was largely missed unless you read the release notes. However, in this latest patchset it's finally front and center. It's time to explore - we (really) need to talk about layout.
    Let's face it, with ADF Faces rich client, layout is a rather arcane pursuit. Once you are a layout master, all bow before you, but it's more of an art than a science, and it is often, in fact, way too difficult to achieve what should (apparently) be pretty simple. Here's a great example; it's a homework assignment I set for folks I'm teaching this stuff to. The requirements for this layout are:
    - The header is 80px high, the footer is 30px. These are both fixed.
    - The first section of the header containing the logo is 180px wide.
    - The logo is centered within the top left hand corner of the header.
    - The title text is start-aligned in the center zone of the header and will wrap if the browser window is narrowed. It should be aligned in the center of the vertical space.
    - The about link is anchored to the right hand side of the browser with a 20px gap and again is center-aligned vertically. It will move as the browser window is reduced in width.
    - The footer has a right-aligned copyright statement, again middle-aligned within a 30px high footer region and with a 20px buffer to the right hand edge. It will move as the browser window is reduced in width.
    - All remaining space is given to a central zone, which, in this case, contains a panelSplitter.
    - Expect that at some point in time you'll need a separate messages line in the center of the footer.
    In the homework assignment I set, I also stipulate that no inlineStyles can be used to control alignment or margins and no use of other taglibs (e.g. JSF HTML or Trinidad HTML). So, if we take this purist approach, that basic page layout (in my stock solution) requires 3 panelStretchLayouts, 5 panelGroupLayouts and 4 spacers - not including the spacer I use for the logo and the contents of the central zone splitter - phew! The point is that even a seemingly simple layout needs a bit of thinking about, particularly when you consider stretching and browser re-size behavior. In fact, this little sample actually teaches you much of what you need to know to become vaguely competent at layouts in the framework. The underlying result of "the way things are" is that most of us reach for panelStretchLayout before even finishing the first sip of coffee as we embark on a new page design. In fact most pages you will see in any moderately complex ADF application will basically be nested panelStretchLayouts and panelGroupLayouts, sometimes many, many levels deep. So this is a problem; we've known this for some time and now we have a good solution. (I should point out that the oft-used Trinidad trh tags are not a particularly good solution, as you're tying yourself to an HTML table based layout in that case, with a host of attendant issues in resize and bi-di behavior, but I digress.)
    So, tadaaa, I give to you panelGridLayout. PanelGrid, as the name suggests, takes a grid-like (dare I say slightly gridbag-like) approach to layout, dividing your layout into rows and columns with margins, sizing, stretch behaviour, colspans and rowspans all rolled in, all without the use of inlineStyle. As such, it provides a much more powerful and concise way of defining a layout such as the one above that is actually simpler and much more logical to design. The basic building blocks are the panelGridLayout itself, gridRow and gridCell. Your content sits inside the cells inside the rows, all helpfully allowing both stretching, valign and halign definitions without the need to nest further panelGroupLayouts. So much simpler! If I break down the homework example above, my nested conglomerate of 12 containers and spacers can be condensed down into a single panelGrid with 3 rows and 5 cell definitions (39 lines of source reduced to 24 in the case of the sample). What's more, the actual runtime representation in the browser DOM is much, much simpler and cleaner, with basically one DIV per cell. (Note that just because the panelGridLayout semantics look like an HTML table does not mean that it's rendered that way!) Another hidden benefit is the runtime cost. Because we can use a single layout to achieve much more complex geometries, the client side layout code inside the browser has to work a lot less. This will be a real benefit if your application needs to run on lower powered clients such as netbooks or tablets. So, it's time, if you're on 11.1.2.2 or above, to smile warmly at your panelStretchLayouts, wrap the blanket around their knees and wheel them off to the Sunset Retirement Home for a well deserved rest. There's a new kid on the block and it wants to be your friend.

    Read the article

  • Manipulating DOM using jQuery

    - by bipinjoshi
    By now you know how to alter existing elements and their attributes. jQuery also allows you to insert, append, remove and replace elements in the HTML DOM so that you can modify the document structure. For example, say you are calling some WCF service from client script and, based on its return values, need to generate an HTML table on the fly. In this article I am going to demonstrate some of these jQuery features by developing a simple RSS gadget that grabs and displays RSS feed items in a web form. http://www.bipinjoshi.net/articles/90f4fbef-ef36-467b-8ca0-d0922a5902b0.aspx

    Read the article
