Search Results

Search found 28603 results on 1145 pages for 'active users'.

  • How to implement User routing like that in StackOverflow ?

    - by rockinthesixstring
    I've looked at the routing on StackOverflow and I've got a very newbie question, but something I'd like clarified nonetheless. I'm looking specifically at the Users controller: http://stackoverflow.com/Users http://stackoverflow.com/Users/Login http://stackoverflow.com/Users/124069/rockinthesixstring What I'm noticing is that there is a "Users" controller, probably with a default "Index" action, and a "Login" action. The problem I am facing is that the Login action can be ignored and a "UrlParameter.Optional [ID]" can be used instead. How exactly does this look in the RegisterRoutes collection? Or am I missing something totally obvious? EDIT: Here's the route I have currently, but it's definitely far from right. routes.MapRoute( _ "Default", _ "{controller}/{id}/{slug}", _ New With {.controller = "Events", .action = "Index", .id = UrlParameter.Optional, .slug = UrlParameter.Optional} _ )
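    A hedged sketch of how routes along those lines might be registered (VB.NET, ASP.NET MVC); the route names, the Details action and the numeric-id constraint are illustrative assumptions, not StackOverflow's actual configuration:

      ' Explicit route for /Users/Login, registered before the catch-all.
      routes.MapRoute( _
          "UserLogin", _
          "Users/Login", _
          New With {.controller = "Users", .action = "Login"})

      ' Matches /Users/124069/rockinthesixstring; the slug is optional and
      ' the id is constrained to digits so it won't swallow action names.
      routes.MapRoute( _
          "UserProfile", _
          "Users/{id}/{slug}", _
          New With {.controller = "Users", .action = "Details", .slug = UrlParameter.Optional}, _
          New With {.id = "\d+"})

      ' Generic default stays last so the specific routes above win;
      ' plain /Users then falls through to Users/Index.
      routes.MapRoute( _
          "Default", _
          "{controller}/{action}/{id}", _
          New With {.controller = "Events", .action = "Index", .id = UrlParameter.Optional})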

    Read the article

  • MongoMapper: how do I create a model like this

    - by Vladimir R
    Suppose we have two models, Task and User. A user can have many tasks, and tasks should be able to have many users too. But a task should also have a unique creator, who is also a user. Example: a task in this context looks like this: Task ID, Task Creator, Users who should do the task. User_1 creates a task and is then the creator. User_1 specifies User_2 and User_3 as users who should do the task, so these last two users are not creators of the task. How do I create these models so that, given a task object, I can find its creator and the users who should complete it? And given a user, how do I find all tasks he created and all tasks he should complete? Thank you.
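    A rough sketch of one way to model this with MongoMapper, under the assumption that assignees are kept as an array of user ids on the task (all names here are illustrative):

      class Task
        include MongoMapper::Document
        key :creator_id,   ObjectId
        key :assignee_ids, Array

        belongs_to :creator, :class_name => 'User'
        # users who should do the task, looked up via the id array
        many :assignees, :class_name => 'User', :in => :assignee_ids
      end

      class User
        include MongoMapper::Document
        key :name, String

        # tasks this user created
        many :created_tasks, :class_name => 'Task', :foreign_key => :creator_id

        # tasks this user should complete
        def assigned_tasks
          Task.where(:assignee_ids => id)
        end
      end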

    Read the article

  • how to model a many to many relationship

    - by Maulin
    Here is the scenario, Articles have many Comments Users can write many Comments for many Articles The comments table contains both user_id article_id as foreign keys My models are set up like so class User < ActiveRecord::Base has_many :comments has_many :articles, :through => :comments class Article < ActiveRecord::Base has_many :comments has_many :users, :through => :comments class Comment < ActiveRecord::Base belongs_to :users belongs_to :articles My routes.rb has the following code map.resources :articles, :has_many => :comments map.resources :users, :has_many => :comments which produces the following routes new_article_comment edit_article_comment new_user_comment edit_user_comment etc... This is not what I want (atleast not what I think I want), since comments must always be related to users and article, how can I get a route like so new_user_article_comment edit_user_article_comment Then I could just do new_user_article_comment_path([@user, @article]) to create a new comment
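    One possible sketch (Rails 2 routing), under the assumption that comments are nested under articles only and the commenting user is taken from the session rather than the URL:

      ActionController::Routing::Routes.draw do |map|
        map.resources :users
        map.resources :articles do |article|
          article.resources :comments
        end
      end

      # gives new_article_comment_path(@article), article_comments_path(@article), etc.,
      # with current_user supplying the user_id inside CommentsController#create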

    Read the article

  • Converting "User Shell Folders" registry value

    - by Sach
    The following registry key contains many system default folder locations: HKEY_LOCAL_MACHINE\Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders The value for the path of the All Users desktop, which is found there, is as follows: XP or earlier: [%ALLUSERSPROFILE%\Desktop] Vista or later: [%PUBLIC%\Desktop] Whereas the actual paths of the All Users desktops, respectively, are as follows: XP or earlier: "C:\Documents and Settings\All Users\Desktop" Vista or later: "C:\Users\Public\Desktop" Now, if you copy and paste the above registry values into Windows Explorer and hit Enter, it takes you to the actual folders. For example, if you paste [%PUBLIC%\Desktop] into Windows Explorer on Vista it takes you to ["C:\Users\Public\Desktop"]. My question is this: how do I reproduce this behavior from within a C# program? To be more specific, if I retrieve the registry value [%PUBLIC%\Desktop] from within a C# program, which I can do easily, how do I convert it to ["C:\Users\Public\Desktop"]? Obviously I'm not looking for a string replacement; I need to do what Windows does.
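    A minimal C# sketch: read the raw REG_EXPAND_SZ value and let Environment.ExpandEnvironmentVariables resolve it, which is essentially what Explorer does (the value name "Common Desktop" is an assumption for the All Users desktop):

      using System;
      using Microsoft.Win32;

      class ShellFolderDemo
      {
          static void Main()
          {
              using (var key = Registry.LocalMachine.OpenSubKey(
                  @"Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders"))
              {
                  // Keep the raw "%PUBLIC%\Desktop" form instead of letting GetValue expand it.
                  var raw = (string)key.GetValue("Common Desktop", string.Empty,
                      RegistryValueOptions.DoNotExpandEnvironmentNames);

                  // Expands to e.g. "C:\Users\Public\Desktop" on Vista or later.
                  var expanded = Environment.ExpandEnvironmentVariables(raw);
                  Console.WriteLine(expanded);
              }
          }
      }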

    Read the article

  • SQL for total count and count within that where condition is true

    - by twmulloy
    Hello, I have a single user table and I'm trying to come up with a query that returns the total count of all users grouped by date along with the total count of users grouped by date who are of a specific client. Here is what I have thus far, where there's the total count of users grouped by date, but can't seem to figure out how to get the count of those users where user.client_id = x SELECT user.created, COUNT(user.id) AS overall_count FROM user GROUP BY DATE(user.created) trying for a row result like this: [created] => 2010-05-15 19:59:30 [overall_count] => 10 [client_count] => (some fraction of overall count, the number of users where user.client_id = x grouped by date)
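    One hedged way to get both numbers in a single pass is conditional aggregation (x standing in for the client_id of interest):

      SELECT DATE(user.created) AS created,
             COUNT(user.id) AS overall_count,
             SUM(CASE WHEN user.client_id = x THEN 1 ELSE 0 END) AS client_count
      FROM user
      GROUP BY DATE(user.created);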

    Read the article

  • ruby on rails has_many through relationship

    - by BennyB
    Hi i'm having a little trouble with a has_many through relationship for my app and was hoping to find some help. So i've got Users & Lectures. Lectures are created by one user but then other users can then "join" the Lectures that have been created. Users have their own profile feed of the Lectures they have created & also have a feed of Lectures friends have created. This question however is not about creating a lecture but rather "Joining" a lecture that has been created already. I've created a "lecturerelationships" model & controller to handle this relationship between Lectures & the Users who have Joined (which i call "actives"). Users also then MUST "Exit" the Lecture (either by clicking "Exit" or navigating to one of the header navigation links). I'm grateful if anyone can work through some of this with me... I've got: Users.rb model Lectures.rb model Users_controller Lectures_controller then the following model lecturerelationship.rb class lecturerelationship < ActiveRecord::Base attr_accessible :active_id, :joinedlecture_id belongs_to :active, :class_name => "User" belongs_to :joinedlecture, :class_name => "Lecture" validates :active_id, :presence => true validates :joinedlecture_id, :presence => true end lecturerelationships_controller.rb class LecturerelationshipsController < ApplicationController before_filter :signed_in_user def create @lecture = Lecture.find(params[:lecturerelationship][:joinedlecture_id]) current_user.join!(@lecture) redirect_to @lecture end def destroy @lecture = Lecturerelationship.find(params[:id]).joinedlecture current_user.exit!(@user) redirect_to @user end end Lectures that have been created (by friends) show up on a users feed in the following file _activity_item.html.erb <li id="<%= activity_item.id %>"> <%= link_to gravatar_for(activity_item.user, :size => 200), activity_item.user %><br clear="all"> <%= render :partial => 'shared/join', :locals => {:activity_item => activity_item} %> <span class="title"><%= link_to activity_item.title, lecture_url(activity_item) %></span><br clear="all"> <span class="user"> Joined by <%= link_to activity_item.user.name, activity_item.user %> </span><br clear="all"> <span class="timestamp"> <%= time_ago_in_words(activity_item.created_at) %> ago. </span> <% if current_user?(activity_item.user) %> <%= link_to "delete", activity_item, :method => :delete, :confirm => "Are you sure?", :title => activity_item.content %> <% end %> </li> Then you see I link to the the 'shared/join' partial above which can be seen in the file below _join.html.erb <%= form_for(current_user.lecturerelationships.build(:joinedlecture_id => activity_item.id)) do |f| %> <div> <%= f.hidden_field :joinedlecture_id %> </div> <%= f.submit "Join", :class => "btn btn-large btn-info" %> <% end %> Some more files that might be needed: config/routes.rb SampleApp::Application.routes.draw do resources :users do member do get :following, :followers, :joined_lectures end end resources :sessions, :only => [:new, :create, :destroy] resources :lectures, :only => [:create, :destroy, :show] resources :relationships, :only => [:create, :destroy] #for users following each other resources :lecturerelationships, :only => [:create, :destroy] #users joining existing lectures So what happens is the lecture comes in my activity_feed with a Join button option at the bottom...which should create a lecturerelationship of an "active" & "joinedlecture" (which obviously are supposed to be coming from the user & lecture classes. 
But the error i get when i click the join button is as follows: ActiveRecord::StatementInvalid in LecturerelationshipsController#create SQLite3::ConstraintException: constraint failed: INSERT INTO "lecturerelationships" ("active_id", "created_at", "joinedlecture_id", "updated_at") VALUES (?, ?, ?, ?) Also i've included my user model (seems the error is referring to it) user.rb class User < ActiveRecord::Base attr_accessible :email, :name, :password, :password_confirmation has_secure_password has_many :lectures, :dependent => :destroy has_many :lecturerelationships, :foreign_key => "active_id", :dependent => :destroy has_many :joined_lectures, :through => :lecturerelationships, :source => :joinedlecture before_save { |user| user.email = email.downcase } before_save :create_remember_token validates :name, :presence => true, :length => { :maximum => 50 } VALID_EMAIL_REGEX = /\A[\w+\-.]+@[a-z\d\-.]+\.[a-z]+\z/i validates :email, :presence => true, :format => { :with => VALID_EMAIL_REGEX }, :uniqueness => { :case_sensitive => false } validates :password, :presence => true, :length => { :minimum => 6 } validates :password_confirmation, :presence => true def activity # This feed is for "My Activity" - basically lectures i've started Lecture.where("user_id = ?", id) end def friendactivity Lecture.from_users_followed_by(self) end # lECTURE TO USER (JOINING) RELATIONSHIPS def joined?(selected_lecture) lecturerelationships.find_by_joinedlecture_id(selected_lecture.id) end def join!(selected_lecture) lecturerelationships.create!(:joinedlecture_id => selected_lecture.id) end def exit!(selected_lecture) lecturerelationships.find_by_joinedlecture_id(selected_lecture.id).destroy end end Thanks for any and all help - i'll be on here for a while so as mentioned i'd GREATLY appreciate someone who may have the time to work through my issues with me...
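    For reference, a hedged sketch of the Lecture side of the association, which is assumed here since it is not shown in the question; if this half is missing, or the join table carries NOT NULL constraints on both ids and one of them is built blank, that is exactly the kind of SQLite constraint failure reported above:

      class Lecture < ActiveRecord::Base
        belongs_to :user   # the creator

        has_many :lecturerelationships,
                 :foreign_key => "joinedlecture_id",
                 :dependent   => :destroy
        has_many :actives, :through => :lecturerelationships, :source => :active
      end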

    Read the article

  • Double Inner Join generates unexpected error

    - by Itamar Marom
    In my database I have three tables: Users: UserID (Auto Numbering), UserName, UserPassword and a few other unimportant fields. PrivateMessages: MessageID (Auto Numbering), SenderID and a few other fields defining the message content. MessageStatus: MessageID, ReceiverID, MessageWasRead (Boolean) What I need is a query to which I input a user's id and I get all the private messages he has received. In addition, I also need to receive each message's sender UserName. For this I wrote the following query: SELECT Users.*, PrivateMessages.*, MessageStatus.* FROM PrivateMessages INNER JOIN Users ON PrivateMessages.SenderID = Users.UserID INNER JOIN MessageStatus ON PrivateMessages.MessageID = MessageStatus.MessageID WHERE MessageStatus.ReceiverID=[@userid]; But for some reason when I try saving it in my Access database, I get the following error (translated to English by me, since my office is in a different language): Syntax error (missing operator) at expression: "PrivateMessages.SenderID = Users.UserID INNER JOIN MessageStatus ON PrivateMessages.MessageID = MessageStatus.MessageI". Any ideas what could cause this? Thanks.
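    Access (Jet SQL) is stricter than most engines here: multiple joins generally have to be nested in parentheses. A sketch of the same query in that form:

      SELECT Users.*, PrivateMessages.*, MessageStatus.*
      FROM (PrivateMessages
            INNER JOIN Users
            ON PrivateMessages.SenderID = Users.UserID)
           INNER JOIN MessageStatus
           ON PrivateMessages.MessageID = MessageStatus.MessageID
      WHERE MessageStatus.ReceiverID = [@userid];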

    Read the article

  • MySQL count problem

    - by Skuja
    I have 3 tables: users(id, name), groups(id, name) and users_groups(user_id, group_id). users and groups have a many-to-many relationship, so the third table stores the user-group relations. I would like to select all the data from groups along with the user count for each group. So far I came up with this: SELECT groups.*, COUNT(users_groups.user_id) AS user_count FROM groups LEFT JOIN users_groups ON users_groups.group_id = groups.id The problem is that the query result does not return any of the groups which have no users (users_groups doesn't have any records with the group_id of those groups). How should I write my query to select all the groups and their user count, with a user count of 0 if there are no users in that group?
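    The LEFT JOIN itself keeps empty groups; assuming a GROUP BY was intended, grouping on groups.id (not on the joined table's column) and counting the nullable users_groups.user_id yields 0 for groups with no members. A sketch:

      SELECT groups.*, COUNT(users_groups.user_id) AS user_count
      FROM groups
      LEFT JOIN users_groups ON users_groups.group_id = groups.id
      GROUP BY groups.id;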

    Read the article

  • asp.net refresh part of page with jquery when application state changes

    - by Jeson Park
    I use the Application property to count the number of online users: Application["OnlineUsers"] = (int)Application["OnlineUsers"] + 1; and then bind the Application property value to an HTML tag: <div id="OnlineUser" > <span>Number of Online Users is <%=Application["OnlineUsers"].ToString() %></span> </div> When, for example, "user1" visits the page, the DIV tag shows "Number of Online Users is 1", and when "user2" visits the page, the DIV tag shows "Number of Online Users is 2", but user1 still sees "Number of Online Users is 1". How do I refresh user1's page when user2 changes the Application property?
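    Server-side state can't push itself into a page that has already been rendered, so one hedged approach is to poll a small endpoint with jQuery and update the DIV in place; the /OnlineUsers.ashx handler name below is an assumption, it would simply write out Application["OnlineUsers"]:

      // re-query the server every 5 seconds and refresh the counter text
      setInterval(function () {
          $.get('/OnlineUsers.ashx', function (count) {
              $('#OnlineUser span').text('Number of Online Users is ' + count);
          });
      }, 5000);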

    Read the article

  • devise forgot password function not working when creating own user controller?

    - by ragupathi
    I use Devise for authentication and I have created a users controller, specified as shown below in my routes, which lets me create, edit and delete users: devise_for :users do resources :users, :only => [:index, :new, :create, :edit, :update, :destroy] end But I cannot make the forgot-password functionality work with this. If I instead specify just devise_for :users, I can use the forgot-password function that comes with Devise, but then I cannot create, edit or delete users. So how can I make both work? Please help me.
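    A hedged sketch of the usual fix: declare the Devise routes and the custom users resource separately instead of nesting one inside the other, so Devise keeps its password-recovery routes while the custom CRUD routes still work:

      devise_for :users

      resources :users, :only => [:index, :new, :create, :edit, :update, :destroy]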

    Read the article

  • Website. VoteUp or VoteDown Videos. How to restrict users voting multiple times?

    - by DJDonaL3000
    I'm working on a website (HTML, CSS, JavaScript, Ajax, PHP, MySQL), and I want to restrict the number of times a particular user votes for a particular video. It's similar to the YouTube system where you can voteUp or voteDown a particular video. Each vote involves adding a row to the video.votes table, which logs the time, the vote direction (up or down), the client IP address (using PHP: $ip = $_SERVER['REMOTE_ADDR']; ), and of course the ID of the video in question. Adding votes is as simple as (pseudocode): Javascript:onClick( vote( a,b,c,d ) ), which passes variables to the PHP insertion script via Ajax, and finally we replace the voting buttons with a "Thank You For Voting" message. THE PROBLEM: If you reload/refresh the page after voting, you can vote again, and again, and again; you get the point. MY QUESTION: How do you limit the number of times a particular user votes for a particular video? MY THOUGHTS: Do you use cookies, adding a new cookie with the ID of the video and checking for it before you insert a new vote? OR, before you insert the vote, do you use the IP address and the video ID to see if this same user (IP) has voted for this same video (vidID) in the past 24 hours (mktime), and either allow or disallow the vote insertion based on this query? OR do you just not care, on the assumption that most users are sane and have better things to do than refresh pages and vote repeatedly? Any suggestions or ideas welcome.
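    Cookies alone are easy to clear, so a common belt-and-braces sketch is to let MySQL enforce one vote per IP per video with a composite unique key (table and column names below are illustrative):

      ALTER TABLE votes
        ADD UNIQUE KEY one_vote_per_ip (video_id, ip_address);

      -- the insert then either succeeds once or fails on the duplicate key,
      -- which the PHP side can catch and report as "already voted"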

    Read the article

  • Grails - showing page processing/render debug information in development mode

    - by ovokinder
    I've been experimenting with Grails for the past two days and, so far, I'm really happy with it. Coming from Rails, the only thing I've really been missing here is the dev-mode debug information shown after the page has been served. This is what I mean: Processing UsersController#show (for 127.0.0.1 at 2010-06-14 10:28:44) [GET] Parameters: {"id"=>"2"} User Load (0.0ms) SELECT * FROM "users" WHERE ("users"."id" = 2) Rendering template within layouts/users Rendering users/show Completed in 24ms (View: 5, DB: 0) | 200 OK [http://localhost/users/2] Is there any way to get something similar in Grails? I've tried the "debug" plugin but it's not very useful as it only shows the total processing time. I know it's not hard to roll something of my own (except for that database stats part), I just wanted to make sure I wasn't unnecessarily reinventing the wheel.
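    There is no exact Rails-style summary line out of the box, but a hedged starting point is turning up SQL and request-dispatch logging in Config.groovy; the logger names below are the commonly used Hibernate/Grails ones and may need adjusting for your version:

      log4j = {
          debug 'org.hibernate.SQL'                                // statements issued per request
          trace 'org.hibernate.type'                               // bound parameter values
          debug 'org.codehaus.groovy.grails.web.mapping.filter'    // URL mapping / controller dispatch
      }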

    Read the article

  • Similar SQL queries returning different results...

    - by Pablo
    Here are the SQL queries: $sql1 = "SELECT count(thread) AS total FROM comments WHERE thread=1 AND parent_id=0 "; $sql2 = "SELECT count(thread) AS total FROM comments, users WHERE thread=1 AND parent_id=0 AND users.user_id=comments.user_id "; $sql3 = "SELECT comments.*, users.username AS username FROM comments, users WHERE thread=1 AND parent_id=0 AND users.user_id=comments.user_id ORDER BY date LIMIT 10, 5 "; My question is: why would $sql1 and $sql2 return two different results? $sql1 returns 61 rows; $sql2 returns 56 rows. The 5th line in $sql2 is just for testing; it is not required, just a variation of $sql1 that gets the total rows for pagination.
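    The usual reason the second count is lower is that the implicit join drops comments whose user_id has no matching row in users (orphaned or NULL user_id). A quick sketch to list exactly those rows and confirm:

      SELECT c.*
      FROM comments c
      LEFT JOIN users u ON u.user_id = c.user_id
      WHERE c.thread = 1
        AND c.parent_id = 0
        AND u.user_id IS NULL;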

    Read the article

  • JSON, Timer, or Ajax: what is faster (for a shared chronometer)?

    - by Felipe
    Hi everybody, I'm developing an application using ASP.NET. First, the idea: "My WebApp needs a chronometer to be shared by users, and all users will see the same value on the chronometer. When a user clicks on a button, the chronometer needs to be restarted and all users will need to see that!" All right, now I'd like to know what the best choice is to get the most performance and make sure that all users see the same value on the chronometer. Should I use JSON (with jQuery on the client side), a Timer with the UpdatePanel from the Ajax Extensions, pure Ajax (with jQuery), or do you have any other idea to suggest? Any suggestion for how to share a chronometer for all users in C# (put the information in the Cache or a database)? Thanks all. Cheers

    Read the article

  • wanted to get all dates in mysql result

    - by PankajK
    I have a MySQL table called user(id, name, join_on); join_on is a date field. What I want is to show how many users have been created on each day. I can use GROUP BY, but it will only give me the dates on which users were added, like: 4/12/10 5 users added, 4/13/10 2 users added, 4/15/10 7 users added. Here the date 4/14/10 is missing, and I want a listing of all dates in one month. I have one solution for it: creating another table only for holding dates, which will left join my users table on join_on and give the total result. But I don't want to do that, since it means I have to create and populate the date table. Please suggest a different approach for doing this. Thank you.
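    For reference, a sketch of the date-table approach the question already mentions (a small dates table with one row per day, left-joined so empty days still show up with 0); short of a helper table like this or a numbers table, older MySQL versions have no clean built-in way to generate the missing days:

      SELECT d.day, COUNT(u.id) AS users_added
      FROM dates d
      LEFT JOIN user u ON DATE(u.join_on) = d.day
      WHERE d.day BETWEEN '2010-04-01' AND '2010-04-30'
      GROUP BY d.day
      ORDER BY d.day;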

    Read the article

  • Select distinct ... inner join vs. select ... where id in (...)

    - by Tonio
    I'm trying to create a subset of a table (as a materialized view), defined as those records which have a matching record in another materialized view. For example, let's say I have a Users table with user_id and name columns, and a Log table with entry_id, user_id, activity, and timestamp columns. First I create a materialized view of the Log table, selecting only those rows with timestamp some_date. Now I want a materialized view of the Users referenced in my snapshot of the Log table. I can either create it as select * from Users where user_id in (select user_id from Log_mview), or I can do select distinct u.* from Users u inner join Log_mview l on u.user_id = l.user_id (need the distinct to avoid multiple hits from users with multiple log entries). The former seems cleaner and more elegant, but takes much longer. Am I missing something? Is there a better way to do this?
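    A third option worth comparing is a semi-join with EXISTS, which avoids both the DISTINCT and the IN-list and often plans as well as or better than either on Oracle:

      SELECT u.*
      FROM Users u
      WHERE EXISTS (SELECT 1
                    FROM Log_mview l
                    WHERE l.user_id = u.user_id);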

    Read the article

  • Trying to install mysql then lots of brew doctor errors

    - by gdi2290
    I couldn't install mysql I get this brew install mysql Error: You must `brew link cmake' before mysql can be installed so then I type brew ink cmake Linking /usr/local/Cellar/cmake/2.8.8... Error: Could not symlink file: /usr/local/Cellar/cmake/2.8.8/share/doc/cmake /usr/local/share/doc is not writable. You should change its permissions. when I typed brew doctor I get this Error: Some directories in /usr/local/share/locale aren't writable. This can happen if you "sudo make install" software that isn't managed by Homebrew. If a brew tries to add locale information to one of these directories, then the install will fail during the link step. You should probably chown them: /usr/local/share/locale/ar /usr/local/share/locale/ar/LC_MESSAGES /usr/local/share/locale/be /usr/local/share/locale/be/LC_MESSAGES /usr/local/share/locale/bg /usr/local/share/locale/bg/LC_MESSAGES /usr/local/share/locale/bs /usr/local/share/locale/bs/LC_MESSAGES /usr/local/share/locale/ca /usr/local/share/locale/ca/LC_MESSAGES /usr/local/share/locale/cs /usr/local/share/locale/cs/LC_MESSAGES /usr/local/share/locale/da /usr/local/share/locale/da/LC_MESSAGES /usr/local/share/locale/de /usr/local/share/locale/de/LC_MESSAGES /usr/local/share/locale/de_AT /usr/local/share/locale/de_AT/LC_MESSAGES /usr/local/share/locale/de_CH /usr/local/share/locale/de_CH/LC_MESSAGES /usr/local/share/locale/de_DE /usr/local/share/locale/de_DE/LC_MESSAGES /usr/local/share/locale/el /usr/local/share/locale/el/LC_MESSAGES /usr/local/share/locale/en_AU /usr/local/share/locale/en_AU/LC_MESSAGES /usr/local/share/locale/en_CA /usr/local/share/locale/en_CA/LC_MESSAGES /usr/local/share/locale/en_GB /usr/local/share/locale/en_GB/LC_MESSAGES /usr/local/share/locale/eo /usr/local/share/locale/eo/LC_MESSAGES /usr/local/share/locale/es /usr/local/share/locale/es/LC_MESSAGES /usr/local/share/locale/es_ES /usr/local/share/locale/es_ES/LC_MESSAGES /usr/local/share/locale/es_PE /usr/local/share/locale/es_PE/LC_MESSAGES /usr/local/share/locale/et /usr/local/share/locale/et/LC_MESSAGES /usr/local/share/locale/fi /usr/local/share/locale/fi/LC_MESSAGES /usr/local/share/locale/fr /usr/local/share/locale/fr/LC_MESSAGES /usr/local/share/locale/fr_FR /usr/local/share/locale/fr_FR/LC_MESSAGES /usr/local/share/locale/gl /usr/local/share/locale/gl/LC_MESSAGES /usr/local/share/locale/he /usr/local/share/locale/he/LC_MESSAGES /usr/local/share/locale/hi /usr/local/share/locale/hi/LC_MESSAGES /usr/local/share/locale/hr /usr/local/share/locale/hr/LC_MESSAGES /usr/local/share/locale/hu /usr/local/share/locale/hu/LC_MESSAGES /usr/local/share/locale/hu_HU /usr/local/share/locale/hu_HU/LC_MESSAGES /usr/local/share/locale/id /usr/local/share/locale/id/LC_MESSAGES /usr/local/share/locale/it /usr/local/share/locale/it/LC_MESSAGES /usr/local/share/locale/ja /usr/local/share/locale/ja/LC_MESSAGES /usr/local/share/locale/ka /usr/local/share/locale/ka/LC_MESSAGES /usr/local/share/locale/ko /usr/local/share/locale/ko/LC_MESSAGES /usr/local/share/locale/lv /usr/local/share/locale/lv/LC_MESSAGES /usr/local/share/locale/mr /usr/local/share/locale/mr/LC_MESSAGES /usr/local/share/locale/nb /usr/local/share/locale/nb/LC_MESSAGES /usr/local/share/locale/nds /usr/local/share/locale/nds/LC_MESSAGES /usr/local/share/locale/nl /usr/local/share/locale/nl/LC_MESSAGES /usr/local/share/locale/nn /usr/local/share/locale/nn/LC_MESSAGES /usr/local/share/locale/oc /usr/local/share/locale/oc/LC_MESSAGES /usr/local/share/locale/pl /usr/local/share/locale/pl/LC_MESSAGES /usr/local/share/locale/pt 
/usr/local/share/locale/pt/LC_MESSAGES /usr/local/share/locale/pt_BR /usr/local/share/locale/pt_BR/LC_MESSAGES /usr/local/share/locale/pt_PT /usr/local/share/locale/pt_PT/LC_MESSAGES /usr/local/share/locale/ro /usr/local/share/locale/ro/LC_MESSAGES /usr/local/share/locale/ru /usr/local/share/locale/ru/LC_MESSAGES /usr/local/share/locale/sk /usr/local/share/locale/sk/LC_MESSAGES /usr/local/share/locale/sr /usr/local/share/locale/sr/LC_MESSAGES /usr/local/share/locale/sv /usr/local/share/locale/sv/LC_MESSAGES /usr/local/share/locale/ta /usr/local/share/locale/ta/LC_MESSAGES /usr/local/share/locale/te /usr/local/share/locale/te/LC_MESSAGES /usr/local/share/locale/tr /usr/local/share/locale/tr/LC_MESSAGES /usr/local/share/locale/uk /usr/local/share/locale/uk/LC_MESSAGES /usr/local/share/locale/vi /usr/local/share/locale/vi/LC_MESSAGES /usr/local/share/locale/zh_CN /usr/local/share/locale/zh_CN/LC_MESSAGES /usr/local/share/locale/zh_HK /usr/local/share/locale/zh_HK/LC_MESSAGES /usr/local/share/locale/zh_TW /usr/local/share/locale/zh_TW/LC_MESSAGES Error: The /usr/local directory is not writable. Even if this directory was writable when you installed Homebrew, other software may change permissions on this directory. Some versions of the "InstantOn" component of Airfoil are known to do this. You should probably change the ownership and permissions of /usr/local back to your user account. Error: "config" scripts exist outside your system or Homebrew directories. ./configure scripts often look for *-config scripts to determine if software packages are installed, and what additional flags to use when compiling and linking. Having additional scripts in your path can confuse software installed via Homebrew if the config script overrides a system or Homebrew provided script of the same name. We found the following "config" scripts: /opt/sm/pkg/active/bin/curl-config /opt/sm/pkg/active/bin/ncurses5-config /opt/sm/pkg/active/bin/ncursesw5-config /opt/sm/pkg/active/bin/pkg-config /opt/sm/pkg/active/bin/xml2-config /opt/sm/pkg/active/bin/xslt-config Error: gettext was detected in your PREFIX. The gettext provided by Homebrew is "keg-only", meaning it does not get linked into your PREFIX by default. If you brew link gettext then a large number of brews that don't otherwise have a depends_on 'gettext' will pick up gettext anyway during the ./configure step. If you have a non-Homebrew provided gettext, other problems will happen especially if it wasn't compiled with the proper architectures. Error: Unbrewed dylibs were found in /usr/local/lib. If you didn't put them there on purpose they could cause problems when building Homebrew formulae, and may need to be deleted. Unexpected dylibs: /usr/local/lib/libboost_filesystem-mt.dylib /usr/local/lib/libboost_serialization-mt.dylib /usr/local/lib/libboost_system-mt.dylib /usr/local/lib/libencfs.6.dylib /usr/local/lib/libintl.8.dylib /usr/local/lib/libmacfuse_i32.2.dylib /usr/local/lib/libmacfuse_i64.2.dylib /usr/local/lib/libosxfuse_i32.2.dylib /usr/local/lib/libosxfuse_i64.2.dylib /usr/local/lib/librlog.5.0.0.dylib Error: Unbrewed .la files were found in /usr/local/lib. If you didn't put them there on purpose they could cause problems when building Homebrew formulae, and may need to be deleted. Unexpected .la files: /usr/local/lib/libosxfuse_i32.la /usr/local/lib/libosxfuse_i64.la Error: Unbrewed .pc files were found in /usr/local/lib/pkgconfig. 
If you didn't put them there on purpose they could cause problems when building Homebrew formulae, and may need to be deleted. Unexpected .pc files: /usr/local/lib/pkgconfig/osxfuse.pc Error: You have unlinked kegs in your Cellar Leaving kegs unlinked can lead to build-trouble and cause brews that depend on those kegs to fail to run properly once built. cmake Error: Your pkg-config is not checking "/usr/X11/lib/pkgconfig" for packages. Earlier versions of the pkg-config formula did not add this path to the search path, which means that other formula may not be able to find certain dependencies. To resolve this issue, re-brew pkg-config with: brew rm pkg-config && brew install pkg-config Error: You have a non-Homebrew 'pkg-config' in your PATH: /opt/sm/pkg/active/bin/pkg-config ./configure may have problems finding brew-installed packages using this other pkg-config. Error: Your Xcode is configured with an invalid path. You should change it to the correct path. Please note that there is no correct path at this time if you have only installed the Command Line Tools for Xcode. If your Xcode is pre-4.3 or you installed the whole of Xcode 4.3 then one of these is (probably) what you want: sudo xcode-select -switch /Developer sudo xcode-select -switch /Applications/Xcode.app/Contents/Developer DO NOT SET / OR EVERYTHING BREAKS!
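    A hedged sketch of the usual remedy for the permission errors above: take ownership of the directories brew doctor complains about, then retry the link and install (adjust the paths to whatever the doctor output lists on your machine):

      sudo chown -R $(whoami) /usr/local/share/doc /usr/local/share/locale
      brew link cmake
      brew install mysql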

    Read the article

  • Network Logon Issues with Group Policy and Network

    - by bobloki
    I am gravely in need of your help and assistance. We have a problem with our logon and startup to our Windows 7 Enterprise system. We have more than 3000 Windows Desktops situated in roughly 20+ buildings around campus. Almost every computer on campus has the problem that I will be describing. I have spent over one month peering over etl files from Windows Performance Analyzer (A great product) and hundreds of thousands of event logs. I come to you today humbled that I could not figure this out. The problem as simply put our logon times are extremely long. An average first time logon is roughly 2-10 minutes depending on the software installed. All computers are Windows 7, the oldest computers being 5 years old. Startup times on various computers range from good (1-2 minutes) to very bad (5-60). Our second time logons range from 30 seconds to 4 minutes. We have a gigabit connection between each computer on the network. We have 5 domain controllers which also double as our DNS servers. Initial testing led us to believe that this was a software problem. So I spent a few days testing machines only to find inconsistent results from the etl files from xperfview. Each subset of computers on campus had a different subset of software issues, none seeming to interfere with logon just startup. So I started looking at our group policy and located some very interesting event ID’s. Group Policy 1129: The processing of Group Policy failed because of lack of network connectivity to a domain controller. Group Policy 1055: The processing of Group Policy failed. Windows could not resolve the computer name. This could be caused by one of more of the following: a) Name Resolution failure on the current domain controller. b) Active Directory Replication Latency (an account created on another domain controller has not replicated to the current domain controller). NETLOGON 5719 : This computer was not able to set up a secure session with a domain controller in domain OURDOMAIN due to the following: There are currently no logon servers available to service the logon request. This may lead to authentication problems. Make sure that this computer is connected to the network. If the problem persists, please contact your domain administrator. E1kexpress 27: Intel®82567LM-3 Gigabit Network Connection – Network link is disconnected. NetBT 4300 – The driver could not be created. WMI 10 - Event filter with query "SELECT * FROM __InstanceModificationEvent WITHIN 60 WHERE TargetInstance ISA "Win32_Processor" AND TargetInstance.LoadPercentage 99" could not be reactivated in namespace "//./root/CIMV2" because of error 0x80041003. Events cannot be delivered through this filter until the problem is corrected. More or less with timestamps it becomes apparent that the network maybe the issue. 1:25:57 - Group Policy is trying to discover the domain controller information 1:25:57 - The network link has been disconnected 1:25:58 - The processing of Group Policy failed because of lack of network connectivity to a domain controller. This may be a transient condition. A success message would be generated once the machine gets connected to the domain controller and Group Policy has successfully processed. If you do not see a success message for several hours, then contact your administrator. 1:25:58 - Making LDAP calls to connect and bind to active directory. DC1.ourdomain.edu 1:25:58 - Call failed after 0 milliseconds. 1:25:58 - Forcing rediscovery of domain controller details. 
1:25:58 - Group policy failed to discover the domain controller in 1030 milliseconds 1:25:58 - Periodic policy processing failed for computer OURDOMAIN\%name%$ in 1 seconds. 1:25:59 - A network link has been established at 1Gbps at full duplex 1:26:00 - The network link has been disconnected 1:26:02 - NtpClient was unable to set a domain peer to use as a time source because of discovery error. NtpClient will try again in 3473457 minutes and DOUBLE THE REATTEMPT INTERVAL thereafter. 1:26:05 - A network link has been established at 1Gbps at full duplex 1:26:08 - Name resolution for the name %Name% timed out after none of the configured DNS servers responded. 1:26:10 – The TCP/IP NetBIOS Helper service entered the running state. 1:26:11 - The time provider NtpClient is currently receiving valid time data at dc4.ourdomain.edu 1:26:14 – User Logon Notification for Customer Experience Improvement Program 1:26:15 - Group Policy received the notification Logon from Winlogon for session 1. 1:26:15 - Making LDAP calls to connect and bind to Active Directory. dc4.ourdomain.edu 1:26:18 - The LDAP call to connect and bind to Active Directory completed. dc4. ourdomain.edu. The call completed in 2309 milliseconds. 1:26:18 - Group Policy successfully discovered the Domain Controller in 2918 milliseconds. 1:26:18 - Computer details: Computer role : 2 Network name : (Blank) 1:26:18 - The LDAP call to connect and bind to Active Directory completed. dc4.ourdomain.edu. The call completed in 2309 milliseconds. 1:26:18 - Group Policy successfully discovered the Domain Controller in 2918 milliseconds. 1:26:19 - The WinHTTP Web Proxy Auto-Discovery Service service entered the running state. 1:26:46 - The Network Connections service entered the running state. 1:27:10 – Retrieved account information 1:27:10 – The system call to get account information completed. 1:27:10 - Starting policy processing due to network state change for computer OURDOMAIN\%name%$ 1:27:10 – Network state change detected 1:27:10 - Making system call to get account information. 1:27:11 - Making LDAP calls to connect and bind to Active Directory. dc4.ourdomain.edu 1:27:13 - Computer details: Computer role : 2 Network name : ourdomain.edu (Now not blank) 1:27:13 - Group Policy successfully discovered the Domain Controller in 2886 milliseconds. 1:27:13 - The LDAP call to connect and bind to Active Directory completed. dc4.ourdomain.edu The call completed in 2371 milliseconds. 1:27:15 - Estimated network bandwidth on one of the connections: 0 kbps. 1:27:15 - Estimated network bandwidth on one of the connections: 8545 kbps. 1:27:15 - A fast link was detected. The Estimated bandwidth is 8545 kbps. The slow link threshold is 500 kbps. 1:27:17 – Powershell - Engine state is changed from Available to Stopped. 1:27:20 - Completed Group Policy Local Users and Groups Extension Processing in 4539 milliseconds. 1:27:25 - Completed Group Policy Scheduled Tasks Extension Processing in 5210 milliseconds. 1:27:27 - Completed Group Policy Registry Extension Processing in 1529 milliseconds. 1:27:27 - Completed policy processing due to network state change for computer OURDOMAIN\%name%$ in 16 seconds. 1:27:27 – The Group Policy settings for the computer were processed successfully. There were no changes detected since the last successful processing of Group Policy. Any help would be appreciated. Please ask for any relevant information and it will be provided as soon as possible.

    Read the article

  • In Flex, how to drag a component into a column of DataGrid (not the whole DataGrid)?

    - by Yousui
    Hi guys, I have a custom component: <?xml version="1.0" encoding="utf-8"?> <s:Group xmlns:fx="http://ns.adobe.com/mxml/2009" xmlns:s="library://ns.adobe.com/flex/spark" xmlns:mx="library://ns.adobe.com/flex/mx"> <fx:Declarations> </fx:Declarations> <fx:Script> <![CDATA[ [Bindable] public var label:String = "don't know"; [Bindable] public var imageName:String = "x.gif"; ]]> </fx:Script> <s:HGroup paddingLeft="8" paddingTop="8" paddingRight="8" paddingBottom="8"> <mx:Image id="img" source="assets/{imageName}" /> <s:Label text="{label}"/> </s:HGroup> </s:Group> and a custom render, which will be used in my DataGrid: <?xml version="1.0" encoding="utf-8"?> <s:MXDataGridItemRenderer xmlns:fx="http://ns.adobe.com/mxml/2009" xmlns:s="library://ns.adobe.com/flex/spark" xmlns:mx="library://ns.adobe.com/flex/mx" focusEnabled="true" xmlns:components="components.*"> <s:VGroup> <components:Person label="{dataGridListData.label}"> </components:Person> </s:VGroup> </s:MXDataGridItemRenderer> This is my application: <?xml version="1.0" encoding="utf-8"?> <s:Application xmlns:fx="http://ns.adobe.com/mxml/2009" xmlns:s="library://ns.adobe.com/flex/spark" xmlns:mx="library://ns.adobe.com/flex/mx" minWidth="955" minHeight="600" xmlns:services="services.*"> <s:layout> <s:VerticalLayout/> </s:layout> <fx:Script> <![CDATA[ import mx.collections.ArrayCollection; import mx.controls.Alert; import mx.controls.Image; import mx.rpc.events.ResultEvent; import mx.utils.ArrayUtil; ]]> </fx:Script> <fx:Declarations> <fx:XMLList id="employees"> <employee> <name>Christina Coenraets</name> <phone>555-219-2270</phone> <email>[email protected]</email> <active>true</active> <image>assets/001.png</image> </employee> <employee> <name>Joanne Wall</name> <phone>555-219-2012</phone> <email>[email protected]</email> <active>true</active> <image>assets/002.png</image> </employee> <employee> <name>Maurice Smith</name> <phone>555-219-2012</phone> <email>[email protected]</email> <active>false</active> <image>assets/003.png</image> </employee> <employee> <name>Mary Jones</name> <phone>555-219-2000</phone> <email>[email protected]</email> <active>true</active> <image>assets/004.png</image> </employee> </fx:XMLList> </fx:Declarations> <s:HGroup> <mx:DataGrid dataProvider="{employees}" width="100%" dropEnabled="true"> <mx:columns> <mx:DataGridColumn headerText="Employee Name" dataField="name"/> <mx:DataGridColumn headerText="Email" dataField="email"/> <mx:DataGridColumn headerText="Image" dataField="image" itemRenderer="renderers.render1"/> </mx:columns> </mx:DataGrid> <s:List dragEnabled="true" dragMoveEnabled="false"> <s:dataProvider> <s:ArrayCollection> <fx:String>aaa</fx:String> <fx:String>bbb</fx:String> <fx:String>ccc</fx:String> <fx:String>ddd</fx:String> </s:ArrayCollection> </s:dataProvider> </s:List> </s:HGroup> </s:Application> Now what I want to do is let the user drag an one or more item from the left List component and drop at the third column of the DataGrid, then using the dragged data to create another <components:Person /> object. So in the final result, maybe the first line contains just one <components:Person /> object at the third column, the second line contains two <components:Person /> object at the third column and so on. Can this be implemented in Flex? How? Great thanks.

    Read the article

  • Search Form using PHP/mySQL and Fancybox iFrame

    - by Cocoonfxmedia
    I am struggling to get a search form in PHP to work with a fancybox iFrame. The search form queries a MySQL database and I want the results to show in the iFrame. However for some reason i can not get this to work. I have pasted the complete code below. Appreciate any support with this? Java Script: <script type="text/javascript"> $(document).ready(function() { $("#tip5").fancybox({ 'scrolling' : 'no', 'titleShow' : false, 'onClosed' : function() { $("#login_error").hide(); } }); }); </script> HTML/PHP <form method="post" action="test2.php?go" id="topF"> <input type="text" name="name"> <input type="submit" name="submit" id="tip5" href="#results" value="Search" class="buttonSubmit"> </form> <div id="results"> <?php if(isset($_POST['submit'])){ if(isset($_GET['go'])){ if(preg_match("/[A-Z | a-z]+/", $_POST['name'])){ $name=$_POST['name']; //-query the database table $sql="SELECT id, COMPANY, TOWN, COUNTY, ServiceStandards,Active FROM main WHERE Active='Y' AND COMPANY LIKE '%" . $name . "%' OR TOWN LIKE '%" . $name . "%'AND Active='Y' OR ServiceStandards LIKE '%" . $name . "%' AND Active='Y' OR COUNTY LIKE '%" . $name . "%'AND Active='Y'"; $result5=mysql_query($sql); $numrows=mysql_num_rows($result5); echo "<p>" .$numrows . " results found for " . stripslashes($name) . "</p>"; //-create while loop and loop through result set while($numrow=mysql_fetch_array($result5)){ $Company =$numrow['COMPANY']; $Town=$numrow['TOWN']; $County=$numrow['COUNTY']; $ID=$numrow['id']; $Service=$numrow['ServiceStandards']; //-display the result of the array echo "<ul>\n"; echo "<li>" . "<a href=\"test2.php?id=$ID\">" .$Company . "</a></li>\n"; echo "</ul>"; } } else{ echo "<p>Please enter a search query</p>"; } } } if(isset($_GET['by'])){ $letter=$_GET['by']; $sql="SELECT *,COMPANY,COMPANY+0 FROM main WHERE Active='Y' AND COMPANY LIKE '" . $letter . "%' ORDER BY COMPANY"; $result5=mysql_query($sql); $result6=mysql_query($sql); $rows=mysql_num_rows($result6); echo "<p>" .$numrows . " results found for " . $letter . "</p>"; while($rows=mysql_fetch_array($result6)){ $Company =$rows['COMPANY']; $Town=$rows['TOWN']; $County=$rows['COUNTY']; $Post=$rows['POSTCODE']; $ID=$rows['id']; //-display the result of the array echo "<ul>\n"; echo "<li>" . "<a href=\"test2.php?id=$ID\">" .$Company . "</a> <br/>" .$Town . ", " .$County. ", " .$Post."</li>\n"; echo "</ul>"; } } if(isset($_GET['id'])){ $contactid=$_GET['id']; $sql="SELECT * FROM main WHERE Active='Y' AND id=" . $contactid; $result7=mysql_query($sql); while($row=mysql_fetch_array($result7)){ $Company =$row['COMPANY']; $Town=$row['TOWN']; $County=$row['COUNTY']; $Email=$row['Email']; $Web=$row['Web']; //-display the result of the array echo 'Company Name: '. $row['COMPANY'] . '<br />'; echo 'Town: '. $row['TOWN'] . '<br/>'; echo 'Postcode: '. $row['POSTCODE'] . '<br/>'; echo 'Telephone: '. $row['TELEPHONE'] . '<br/>'; echo 'Email: '. $row['Email'] . '<br/>'; echo 'Web: ' . "<a href=".$Web.">" .$Web . "</a> <br/>\n"; echo ' '. '<br/>'; echo 'Service: '. $row['ServiceStandards'] . '</br>'; } } ?> </div>

    Read the article

  • Webcast Q&A: Los Angeles Department of Building & Safety Lowers Customer Service Costs with Oracle WebCenter

    - by Kellsey Ruppel
    This week we had the fifth webcast in our WebCenter in Action webcast series, "Los Angeles Department of Building & Safety Lowers Customer Service Costs with Oracle WebCenter", where customers Giovani Dacumos and Minh Ong from the Los Angeles Department of Building & Safety (LADBS), and Sheetal Paranjpye and Rajiv Desai from Oracle Partner 3Di, shared how Oracle WebCenter is powering LADBS' externally facing website and providing a superior self-service experience for their customers. We asked the speakers to provide some dialogue for Q&A.   Giovani Dacumos, Director of Systems and Minh Ong, LADBS Q: Did you run into any issues when integrating all of the different applications together?A: Yes. We did have issues integrating a secure sign on between the portal and other legacy applications. We used portlets and iframes to overcome those.  This is a new technology for us and we are also learning as we go so there were a lot of challenges in developing and implementing our vision. Q: What has been the biggest benefit your end users have seen?A: The biggest benefit for our ends users is ease-of-use. We've given them a system that provided a new and improved source of information, as well as a very organized flow of transaction processing. It has made our online service very user friendly. Q: Was there any resistance internally when implementing the solution? If so, how did you overcome that?A: There was no internal resistance during the implementation, only challenges. As mentioned earlier, this is a new technology for us. We've come across issues that needed assistance from Oracle. Working with 3Di and Oracle has helped us tremendously to find solutions to our implementation issues. Q: Given the performance, what do you estimate to be the top end capacity of the system? A: With the current performance and architecture we have, we are able to support approx 300-400 concurrent users.  We would need more hardware to support additional user load. Q: What's the overview or summary of feedback from the users interacting with the site?A: LADBS has a wide spectrum of customers, from simple users like homeowners to large construction firms. Anything new that we offer could be a little bit challenging for some, but overall, the customers liked it. They saw a huge improvement on the usability. Q: Can you describe the impressions about the site before and after the project within LADBS?A: The old site was using old technology and it was hard for us to keep on building into it as we got more business requirements. It made our application seem a bit complicated.  It was confusing for our new customers to use and we've improved on this with the new site. It's now easier for them to complete their transactions and, at the same time, allowed us to provide more useful information. Sheetal Paranjpye and Rajiv Desai, 3Di Q: Did you run into any obstacles when implementing the solution?A: Yes we did run into some obstacles. One of the key show stoppers was the issue with portlet to portal communication. The GIS viewer (portlet) needed information to be passed  to and from Permit LA (Portal), but we were able to get everything configured and up and working quickly! Q: Was there a lot of custom work that needed to be done for this particular solution?A: We have done some customizations where workflows/ Task flows are involved.  
Q: What do you think were the keys to success for rolling out WebCenter?A: Having a service oriented architecture and using portlets have been the key areas for rolling out Oracle WebCenter at LADBS. The Oracle WebCenter Content integration allows the flexibility to business users to maintain the content, which has really cut down on the reliance of IT, and employee productivity has increased as a result. If you missed the webcast, be sure to catch the replay to see a live demonstration of WebCenter in action! Los Angeles Department of Building & Safety Lowers Customer Service Costs with Oracle WebCenter from Oracle WebCenter

    Read the article

  • New security options in UCM Patch Set 3

    - by kyle.hatlestad
    While the Patch Set 3 (PS3) release was mostly focused on bug fixes and such, some new features sneaked in there. One of those new features is to the security options. In 10gR3 and prior versions, UCM had a component called Collaboration Manager which allowed for project folders to be created and groups of users assigned as members to collaborate on documents. With this component came access control lists (ACL) for content and folders. Users could assign specific security rights on each and every document and folder within a project. And it was even possible to enable these ACL's without having the Collaboration Manager component enabled (see technote# 603148.1). When 11g came out, Collaboration Manager was no longer available. But the configuration settings to turn on ACLs were still there. Well, in PS3 they're implemented slightly differently. And there is a new component available which adds an additional dimension to define security on the object, Roles. So now instead of selecting individual users or groups of users (defined as an Alias in User Admin), you can select a particular role. And if a user has that role, they are granted that level of access. This can allow for a much more flexible and manageable security model instead of trying to manage with just user and group access as people come and go in the organization. The way that it is enabled is still through configuration entries. First log in as an administrator and go to Administration -> Admin Server. On the Component Manager page, click the 'advanced component manager' link in the description paragraph at the top. In the list of Disabled Components, enable the RoleEntityACL component. Then click the General Configuration link on the left. In the Additional Configuration Variables text area, enter the new configuration values: UseEntitySecurity=true SpecialAuthGroups=<comma separated list of Security Groups to honor ACLs> The SpecialAuthGroups should be a list of Security Groups that honor the ACL fields. If an ACL is applied to a content item with a Security Group outside this list, it will be ignored. Save the settings and restart the instance. Upon restart, three new metadata fields will be created: xClbraUserList, xClbraAliasList, xClbraRoleList. If you are using OracleTextSearch as the search indexer, be sure to run a Fast Rebuild on the collection. On the Check In, Search, and Update pages, values are added by simply typing in the value and getting a type-ahead list of possible values. Select the value, click Add and then set the level of access (Read, Write, Delete, or Admin). If all of the fields are blank, then it simply falls back to just Security Group and Account access. For Users and Groups, these values are automatically picked up from the corresponding database tables. In the case of Roles, this is an explicitly defined list of choices that are made available. These values must match the role that is being defined from WebLogic Server or you LDAP/AD repository. To add these values, go to Administration -> Admin Applets -> Configuration Manager. On the Views tab, edit the values for the ExternalRolesView. By default, 'guest' and 'authenticated' are added. Once added to through the view, they will be available to select from for the Roles Access List. As for how they are stored in the metadata fields, each entry starts with it's identifier: ampersand (&) symbol for users, "at" (@) symbol for groups, and colon (:) for roles. Following that is the entity name. 
And at the end is the level of access in parentheses, e.g. (RWDA). Each entry is separated by a comma. So if you were populating values through Batch Loader or an external source, the values would be defined this way. Detailed information on Access Control Lists can be found in the Oracle Fusion Middleware System Administrator's Guide for Oracle Content Server.
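As a hedged illustration of that stored format (the user, alias and role names here are made up):

      xClbraUserList  = &jsmith(RWDA), &mjones(RW)
      xClbraAliasList = @HR_Dept(RW)
      xClbraRoleList  = :contributor(RWD), :guest(R)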

    Read the article

  • Using the ASP.NET Membership API with SQL Server / SQL Azure: The new “System.Web.Providers” namespace

    - by Harish Ranganathan
    The Membership API came in .NET 2.0 and was a huge enhancement for building web applications with users, roles, permissions and so on. By default the Membership API uses SQL Express, and until Visual Studio 2008 it was available only through the ASP.NET Configuration screen (Website – ASP.NET Configuration, or Project – ASP.NET Configuration); for every application, one had to manually visit this screen to start using the security and other settings. Upon doing that, the default SQL Express database aspnet.mdf is created to store all the user profiles.

    Starting with Visual Studio 2010 and .NET 4.0, the default website template includes the Membership API controls as part of the page. When you create a “File – New – ASP.NET Web Application” or an “ASP.NET MVC Application”, the Login/Register controls are enabled in the MasterPage by default, and they are configured under the “ApplicationServices” setting in the web.config file, with the connection string pointed at the SQL Express database. In fact, when you run the default website, click “LogOn” -> “Register”, enter the registration details and click “Register”, that is the moment the aspnet.mdf file is created with the tables for Users, Roles, UsersInRoles, Profile etc.

    Now, this uses the default SQL Express database within the App_Data folder. If you want to move your Membership information to some other database such as SQL Server, SQL CE or SQL Azure, you need to manually run the aspnet_regsql command and specify the destination database name. This creates all the Tables, Procedures and Views required to handle the Membership information. Thereafter you can change the connection string for “ApplicationServices” to point to the database where you ran the scripts.

    Now, enter “System.Web.Providers” Alpha. This is available as part of the NuGet package library. Scott Hanselman has a neat post describing the steps required to get it up and running, as well as the basic changes, at http://www.hanselman.com/blog/IntroducingSystemWebProvidersASPNETUniversalProvidersForSessionMembershipRolesAndUserProfileOnSQLCompactAndSQLAzure.aspx Pretty much, it covers what the new System.Web.Providers do.

    One thing I wanted to clarify is that the new “System.Web.Providers” add a lot of new settings to the web.config, which are also marked as the defaults. Even now, they use SQL Express as the default database. But if you change the connection string for “DefaultConnection” under connectionStrings to point to your SQL Server or SQL Azure database, the Membership API will now create all the tables, procedures and views at the destination you specified (i.e. SQL Server or SQL Azure).

    In my case, I modified the DefaultConnection to point to my SQL Azure database. Next, I hit F5 to run the application, and the default view loaded. I clicked on “LogOn” and then “Register”, since I knew there were no tables/users at that point. One thing to note is that I had put “NewDB” as the database name in the connection string pointing to SQL Azure; NewDB didn't exist yet, and I assumed it would be created before the tables/views/procedures for Membership were created. Once I clicked “Register” to register my first username, it took a while, then registered the user and logged me in.
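    For reference, here is a minimal sketch of what the connection string change described above might look like in web.config. The server name, user and password are placeholders (only the database name NewDB comes from my walkthrough), and the MARS setting is included because the new providers need it, as noted below:

        <connectionStrings>
          <!-- DefaultConnection pointed at SQL Azure instead of the local SQL Express aspnet.mdf -->
          <add name="DefaultConnection"
               providerName="System.Data.SqlClient"
               connectionString="Server=tcp:yourserver.database.windows.net;Database=NewDB;User ID=youruser@yourserver;Password=yourpassword;Encrypt=True;MultipleActiveResultSets=True" />
        </connectionStrings>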
    I also went to the SQL Azure Management Portal and verified that the “NewDB” database had just been created. I could also connect to “NewDB” from Management Studio and found that the tables no longer carry the aspnet_ prefix: they are simply Users, Roles, UsersInRoles, Profiles etc. So, with a few clicks and a configuration change, I could set up the user base for my application on SQL Azure, and even have SessionState, Roles and Profiles stored in the SQL Azure database. The new System.Web.Providers also require MARS (MultipleActiveResultSets=True) in the connection string, since they use Entity Framework for the DAL operations. Also, the “Project – ASP.NET Configuration” screen can still be used to create and manage users and roles, even though the data is stored in the remote database. With that, a long-pending request from the community is fulfilled: the ability to configure and use remote databases for managing application users without having to run the scripts from SQL Express. Cheers !!!

    Read the article

  • How to make software development decisions based on facts

    - by Laila
    We love to hear stories about the many and varied ways our customers use the tools that we develop, but in our earnest search for stories and feedback, we'd rather forgotten that some of our keenest users are fellow RedGaters, in the same building. It was almost by chance that we discovered how the SQL Source Control team were using SmartAssembly. As it happens, there is a separate account (here on Simple-Talk) of how SmartAssembly was used to support the Early Access program, by providing answers to specific questions about how the SQL Source Control product was used. But what really got us all grinning was how valuable the SQL Source Control team found the reports that SmartAssembly was quickly and painlessly providing. So gather round, my friends, and I'll tell you the Tale Of The Framework Upgrade.

    <strange mirage effect to denote a flashback. A subtle background string of music starts playing in minor key>

    Kevin and his team were undecided. They weren't sure whether they could move their software product from .NET 2 to .NET 3.5, let alone to .NET 4. You see, they were faced with having to guess what version of .NET was already installed on the average user's machine, which I'm sure you'll agree is no easy task. Upgrading their code to .NET 3.5 might put up a barrier to people trying the tool, which was the last thing Kevin wanted: "what if our users have to download X, Y, and Z before being able to open the application?" he asked. That fear of users having to sit through half an hour of downloads (followed by at least ten minutes of installation, followed by a five-minute restart) meant that Kevin's team couldn't take advantage of WCF (Windows Communication Foundation). This made them sad, because WCF would have allowed them to write their code in a much simpler way, and in hours instead of days (as was the case with .NET 2). Oh sure, they had a gut feeling that this probably wasn't the case (3.5 had been out for so many years), but they weren't sure.

    <background music switches to major key>

    SmartAssembly Feature Usage Reporting gave Kevin and his team exactly what they needed: hard data on their users' systems, both hardware and software. I was there, I saw it happen, and that's not the sort of thing a woman quickly forgets. I'll always remember his last words (before he went to lunch): "You get lots of free information by just checking a box in SmartAssembly". For example, they could see how many CPU cores their customers were using, and found out that they should be making use of parallelism to take advantage of the available cores. But crucially (and this is the moral of my tale, dear reader), Kevin saw that 99% of SQL Source Control's users were on .NET 3.5 or above. So he knew that they could make the switch, and that it was safe to do so. With this reassurance, they could use WCF not only to make development easier, but also to give them a really nice way to do inter-process communication between the Source Control and SQL Compare products. To have done that on .NET 2.0 was certainly possible <knowing chuckle>, but Microsoft have made it a lot easier with WCF.

    <strange mirage effect to denote end of flashback>

    So you see, with Feature Usage Reporting, they finally got the hard evidence they needed to safely make the switch to .NET 3.5, knowing it would not inconvenience their users. And that, my friends, is just the sort of thing we like to hear.

    Read the article

< Previous Page | 173 174 175 176 177 178 179 180 181 182 183 184  | Next Page >