Today, after installing patches on the WSUS server, it stopped working following the restart.
What I should have done is take a backup of the System State before installing the patches!
In any case, the machine booted after the restart, but the following errors appeared:
I had to set the Web Publishing service to NOT allow desktop interaction
ASP.NET 2.0.50727.0 EventId 1314 appeared in the event log with the following error:
Event code: 4011
Event message: An unhandled access exception has occurred.
..... etc.
I got to the point where I had no authentication mechanism, even though all services were running. No users in the WSUS local groups... strange...
So, rather than removing the patches, I uninstalled WSUS and reinstalled it (leaving all the information intact: the database, logs, and one other option I can't remember). The catch is that the underlying issue most likely remains, so I may still need to manually uninstall the updates I applied before the reinstallation.
Upon second attempt at reinstallation... it worked.
Friday, January 04, 2008
Friday, December 28, 2007
Migrating over from a 2000 Domain Controller to a 2003 Domain Controller
http://technet2.microsoft.com/windowsserver/en/library/99f53498-ce25-4ab4-b476-7aa6e1997d641033.mspx?mfr=true
http://support.microsoft.com/kb/555549
Tuesday, October 16, 2007
Control your data on a 2003 file server
I came across this article at TechRepublic and am impressed. I will be implementing this shortly in a network and will follow up with my findings.
The ability to restrict not just mp3s, but perhaps make it so only certain types of file can be saved in certain locations is what I am really after.
Run Internet Explorer on *nix
Hmm, this could be an interesting way to mitigate MS saturation in a network. The downside is that legal use of it still requires an MS OS license.
Monday, October 15, 2007
IIS 6.0 Nuances
Today while rearranging some IIS websites I came across a particularly frustrating challenge.
The work involved adding an additional IP (and NIC) to our web server. Host headers were not an option because the sites use SSL. After applying all the necessary NIC and firewall settings, the pre-existing website, when assigned to this new IP address, would not start; it came back with a message stating that there was a potential conflict from using the same ports.
Interestingly, I was able to create a new site, bound to the new IP from the setup wizard, which would start. So what seems to have happened here is that the metabase did not get updated appropriately for the pre-existing site. Rather than use a tool such as httpcfg in this situation, it was far quicker to simply create the new site. If the websites were many, or were part of a cluster, then using the httpcfg Windows support tool would have been all but necessary.
Thursday, October 11, 2007
Firefox and NTLM Authentication
A network I support has a number of Firefox users. Recently, after strengthening their GP settings to enhance security, Firefox users started to experience issues logging in to IIS servers that use Integrated Authentication. This is because Firefox will only attempt automatic NTLM authentication against sites it has been told to trust. In order to enable NTLM authentication in Firefox, I used the information in this link: http://www.cauldwell.net/patrick/blog/PermaLink,guid,c7f1e799-c4ae-4758-9de7-5c3e7a16f3da.aspx
Basically, the network.automatic-ntlm-auth.trusted-uris setting in Firefox needs to list the servers for which NTLM authentication will be required.
If Proxies are required, this setting will be of use: network.automatic-ntlm-auth.allow-proxies
For those wanting a more original source of information, please see the mozilla knowledge base: http://kb.mozillazine.org/Network.automatic-ntlm-auth.trusted-uris
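As a sketch of how this could be rolled out without touching each machine's about:config by hand, the same prefs can be dropped into a user.js file in the Firefox profile folder, which is re-read at every Firefox start and overrides prefs.js. The server names below are invented placeholders:

```
// user.js - placeholder server names, substitute your own IIS hosts
user_pref("network.automatic-ntlm-auth.trusted-uris", "intranet.example.local,reports.example.local");
user_pref("network.automatic-ntlm-auth.allow-proxies", true);
```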
Monday, October 08, 2007
The Powershell
One of the nice things I have found useful with PowerShell is the resiliency it provides over a command prompt.
A while ago I was copying some directory structures that have some archaic deny permissions set here and there. In a scramble (You can tell this environment is organized, can't you?) to copy the folder structure and files, I did something like:
xcopy c:\FolderParent\* e:\NewFolder\ /E
Unfortunately, at the first access-denied error, the copy process aborts, and no information gets displayed about where the copy stopped.
PowerShell allows you to do something like:
copy-item c:\FolderParent\ e:\NewFolder\ -recurse
This will output any errors in red (by default) but continue copying the rest of the data (which is listed by default). You can also tack on the -Exclude parameter to leave out files AND folders that match the specified pattern (like *.txt).
Now if I take the time, I should be able to doctor up a nice master script that will output a complete log, and an estimated time to copy x GBs !
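I haven't built that script yet, but the keep-going-and-log pattern itself is simple; here it is sketched in plain shell for illustration (all paths invented). GNU cp behaves much like Copy-Item here: it reports each failure on stderr but carries on with the rest of the tree, so redirecting stderr gives you the error log for free:

```shell
# Set up a tiny example tree to copy.
mkdir -p /tmp/FolderParent/sub /tmp/NewFolder
printf 'hello\n' > /tmp/FolderParent/sub/file.txt

# Recursive copy: each file is listed on stdout (-v), and any
# access-denied errors accumulate in the log instead of aborting.
cp -Rv /tmp/FolderParent/. /tmp/NewFolder/ 2>/tmp/copy-errors.log
```

Afterwards, an empty /tmp/copy-errors.log means a clean run; otherwise it tells you exactly which paths were skipped.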
The Glory of a UPS
Since I have last posted I have passed my 70-290 exam.
I am now working on the 70-291 and CCNA 640-801 exams. The 70-291 seems to be mostly about technical knowledge and the use of Network Monitor.
Back when I passed the 70-290 I had registered for the CCNA-INTRO exam. I missed a PASS by about 2 questions (12 points or something) which was a bit of a bummer. So, now I am going to take the CCNA full-on in the next few weeks, as the exam completely changes in early November 2007 and I REALLY do not want to lose any study hours I have put in thus far!
So last week I migrated a box that had the role of file server. That basically meant most of the company's important data lived there, with everything else scattered about wherever there was space. In the throes of a last-minute rushed migration, I hooked up one of our nice new APC UPSes. Coincidentally, the following week (this Monday morning!) the local power company decided to yo-yo the grid for the entire morning. The bonus? Our UPS worked wonderfully. The drawback? Everybody still experienced downtime at their workstations.
It has been a late night of studying for the CCNA. Now I am off to bed, ready to take on tomorrow's onslaught of machinistic demands :-) Now if I could just learn how to master a PIX in a couple of hours... that would bode well for my network overhaul project!
Thursday, June 07, 2007
Virtualization Options
The below link is a nice starting point for finding out more about virtualization:
http://www.petri.co.il/virtual_virtualization_options_compared.htm
FindString Performance
Today I was tasked with finding out whether a particular string exists in a very large amount of data (18GB) that is not indexed. The files are preparation files to be loaded into a database. The real shortcoming I find with findstr compared to grep is that findstr does not let you search select subdirectories without writing a batch file to script the work. With grep, this can be done straight from the CLI.
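For illustration (the directory and file names below are invented), grep can be pointed at just the subdirectories of interest in a single command:

```shell
# Build a small example layout: two directories we care about, one we don't.
mkdir -p /tmp/prep/batch1 /tmp/prep/batch2 /tmp/prep/archive
echo "ACCOUNT-42 pending load" > /tmp/prep/batch1/a.dat
echo "nothing of note"         > /tmp/prep/archive/b.dat

# -r recurses into the listed directories only; -l prints matching file names.
grep -rl "ACCOUNT-42" /tmp/prep/batch1 /tmp/prep/batch2
# prints /tmp/prep/batch1/a.dat
```

The archive directory is never touched, which is exactly the selective search that findstr makes you script by hand.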
Tuesday, May 16, 2006
Domain Account Policy Password Settings
If you ever have an organization with multiple password complexity requirements, definitely consider a third-party tool to handle the job for you. I came across a domain setup recently where the password and lockout policies had been set, but there was a select group of restricted users with a no-password requirement.
*Note, do not set up a policy without password restrictions/requirements.
The problem is that one can set only one password policy for the entire domain. So, short of a third-party utility or another domain, here is what I did as a workaround.
First, make sure you have documented the policy before making adjustments.
Temporarily disable the password policy (do this out of hours) by setting minimum lengths, durations, etcetera to zero. If you simply disable the policy, the previous settings will remain in effect, and you will be unable to adjust users' passwords in the interim. Once that is done, you are able to set zero-length passwords for your select group of restricted users. Then return the password policy settings to what they were.
As an alternative (yet equally insecure) accessibility option, the following works for users who only use one device that nobody else will touch (such as a communication aid/talker). Remember, though, that the password gets stored in plain text, which is a big no-no.
*Edit the registry solely at your own risk!
Go to the following key:
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon
1) Add a Value called DefaultPassword
2) Make the data type REG_SZ
3) Type in the user password in the string editor
4) Add another Value called AutoAdminLogon
5) Data type as REG_SZ
6) Set the value to 1
7) Set the ForceAutoLogon value to 1
When you restart the machine, you should be automatically logged in to the default domain.
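For reference, steps 1 through 7 can be captured in a single .reg file. The user name, domain, and password below are placeholders; note that DefaultUserName and DefaultDomainName are normally needed as well, so Winlogon knows which account to log on with:

```
Windows Registry Editor Version 5.00

; Placeholder credentials - never use a real privileged account here.
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon]
"DefaultUserName"="restricteduser"
"DefaultDomainName"="EXAMPLEDOMAIN"
"DefaultPassword"="PlaceholderPassword"
"AutoAdminLogon"="1"
"ForceAutoLogon"="1"
```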
Wednesday, April 05, 2006
Logging Software
If you are looking for a software auditing tool/alternative to MS SMS, have a look at LOGINventory. It comes with a free 20 client license and gives nice reports. If you intend to audit more than 20 devices however, you need to purchase additional licenses. A great tool for a small business network!
Unable to Find Network Path with Remote Registry
Earlier today I was unable to access a remote client to view its Event Viewer. I was, however, able to access it with Remote Desktop. My first check was to see if the Remote Registry service was enabled, and indeed it was. Each time I tried to connect to another computer, I was faced with a long pause and the eventual message "Network Path Not Found".
The solution was to ensure that File & Printer sharing is enabled on the network adapter.
Tuesday, January 31, 2006
UPS setup with secondary server remote shutdown
Recently, one of the locations I support finally acquired some UPSes. After a few surges the other week, they no longer wanted to pay me to rebuild a server (or hear "I told you so").
We have a Belkin SurgeMaster 500VA battery backup connected via USB to a Windows 2000 server. The UPS also provides power to our Linux (Karoshi, http://www.karoshi.org.uk ) intranet and proxy/filtering server. I chose to set up the UPS monitor through the 2000 box because, in all likelihood, when I move on the next person may not know much about Linux. Or even worse, a consultant comes in to charge extra $$ because the UPS is hooked up via Linux. The drawback of working in a school is funds. A passing thought: I wonder where all the extra UK petrol tax is going since the price increases... Anyways, back to work.
After installing and setting up the 2000 server as the UPS monitor, I needed a way to safely shut down the Linux box. This was achieved by setting up an SSH tunnel using plink (a command-line version of PuTTY) and executing:
shutdown -h now
Sounds easy enough, except our Linux box authenticates everything via Kerberos/LDAP through a 2000 domain controller, so it wasn't just a matter of setting up the batch file to log in and shut down, because groups aren't synched. In short, there was no easy/quick way either to get a sudoer hooked in, or to enter a second password through a tty (which is good security).
What was needed:
- sshd installed on the linux box
- a folder with NTFS permissions on the 2000 server, for this example we'll call the folder powerdown
- plink on the server monitoring the UPS (http://www.chiark.greenend.org.uk/~sgtatham/putty/download.html)
- A batch file called upsShutdown.bat
- A plink.exe option file called linuxHalt.txt
Once we have the SSH server set up, try logging in to the local machine as a client. If this isn't successful, you most likely have to start sshd by running the following as root:
/etc/init.d/sshd start
If you still cannot log in locally, it's time to go Google and read the various man pages.
After downloading plink to the powerdown folder, open a command prompt on the 2000 server (Start, Run, cmd). Change directory to the powerdown folder and attempt a login by typing:
C:\powerdown> plink linuxCompName -l username -pw ******
You should have a shell after accepting the host key prompt. Now we know we can securely log in to our Linux box from Windows, yay! Go celebrate with caffeine or have a play around with your shell.
In order to shell in as root, we have to make a change in our sshd configuration file, located at /etc/ssh/sshd_config: change PermitRootLogin no to PermitRootLogin yes.
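That edit can be scripted too; here is a minimal sketch run against a scratch copy of the file (the real file is /etc/ssh/sshd_config, and sshd needs a restart afterwards for the change to take effect):

```shell
# Make a scratch stand-in for sshd_config so nothing real is touched.
printf 'Port 22\nPermitRootLogin no\n' > /tmp/sshd_config

# Flip PermitRootLogin from no to yes in place (GNU sed -i).
sed -i 's/^PermitRootLogin no$/PermitRootLogin yes/' /tmp/sshd_config

grep '^PermitRootLogin' /tmp/sshd_config
# prints PermitRootLogin yes
```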
Because we want the shutdown to be fully automated, we need a batch file that gets called by the UPS monitor. On the 2000 server, open up Notepad and enter the following:
plink linuxBoxName -batch -l root -pw plaintextPassword!? -m linuxHalt.txt
Save the file as "upsShutdown.bat", and make sure to include the quotes; otherwise the file will be saved as a text file.
The -batch option disables any confirmation prompts. The -l option indicates who will log in. The -pw switch supplies the password.
Wait a minute! A plaintext file with our coveted root login details!?!? OBVIOUSLY, this is not best practice! The scenario here requires logging in as root because the authentication scheme only handles LDAP domain user accounts, not the local accounts on the Linux box. If I had more time to spend on this, I would come up with something better. With that said, if somebody has a simpler solution, please let me know! Whatever changes, screwups, or problems you have in your network or on your computers because of this information are ONLY your fault. Follow these instructions at your own risk/peril/busted/hacked network!
Of course, with the powerdown folder being on an NTFS volume, tighten your permissions to allow only SYSTEM, and whatever user your UPS monitor software runs as, with Read and Execute only.
Remember, we need a way to pass commands once the session is open. This is done through the -m option in the batch file: plink will open linuxHalt.txt and execute the commands in that file on the remote Linux server. Our text file simply has one line in it:
shutdown -h now
Save your changes.
Now, make sure any files open on your Linux box are saved, and execute the batch file. This simulates the UPS calling the powerdown script. You should see your Linux box go down immediately. You can also swap the now parameter for +5, which schedules the shutdown in five minutes.
Finally, we ensure that our UPS software (Belkin in this case) executes the batch file by entering the full path to the batch script (e.g. c:\powerdown\upsShutdown.bat). If you have a test environment, test it there. If not, wait until after hours (duh) and test it live.
Saturday, December 17, 2005
Using SSH tunnels with Windows
The other day I finally got an SSH tunnel working that wrapped my Remote Desktop connection. Why bother, you might ask, since RDP is already secured with RC4? Well, because the same technique lets you wrap FTP, Telnet, or any other plain-text protocol in SSH as well.
The trouble I initially had with the free SSH solution was configuring PuTTY. Instead of choosing an arbitrary local port to run Remote Desktop from, I was attempting to use the default port of 3389. Duh, you can't open both sides of a port to the same machine and expect to talk to another machine! So if you follow these two links, you too can have your very own SSH tunnel between two Windows boxes:
http://pigtail.net/LRP/printsrv/cygwin-sshd.html
http://theillustratednetwork.mvps.org/Ssh/RemoteDesktopSSH.html
What you will be doing is installing Cygwin with SSH support on your host/server Windows box (or a Linux machine if that is what is desired, in which case it won't be Remote Desktop but VNC). Cygwin is a Linux-like environment that runs directly on top of Windows. For the SSH client on the connecting machine, you can either set up another Cygwin installation or use PuTTY (at the above link). I say use PuTTY unless you need SSH access to that machine; PuTTY is about 300KB, very small.
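For reference, the finished tunnel boils down to a single plink invocation (the host name, user name, and the local port 3390 below are illustrative placeholders, not taken from the articles):

```
rem Forward local port 3390 to the remote box's RDP port 3389 over SSH.
plink -ssh -L 3390:localhost:3389 username@sshServerName
rem Then point the Remote Desktop client at localhost:3390;
rem the whole session now rides inside the encrypted SSH connection.
```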
Wednesday, November 30, 2005
Mambo or Joomla
I have recently started to roll out these CMSes to schools that want an up-and-running website that is easy to maintain. The nice thing about them is just that. The crap thing is that there doesn't seem to be an easy way "yet" to prevent the general public from registering as users on your site, or to assign various security levels to front-end/general-public groups.
Of all the various Content Management Systems I have tried, Joomla is by far the easiest to set up and operate, and gives the crispest, most professional feel. If you are going to set up an interactive website, use Joomla.
Aside from that, though, I will link to the two sites that have been cookie-cut with these solutions once they are completed.
Apparently, over the last couple of months there was a shake-up among Mambo programmers. This caused the birth of the Mambo Foundation and of Joomla. From what I gather, Mambo has been (or is going to be) completely rewritten and is now set out to make money from its efforts (nothing wrong with that). The programmers devoted to the open-source ideologies have carried on as Joomla.
**Update 31/1/06
Some of what I wrote here is inaccurate now. Here is a link to a more recent rundown of the dispute between Mambo's parent company, Miro International, and the old developers who have forked into Joomla.
I know that, as an implementer for low-budget organizations, I will continue to stay with the original code, which has worked well in the past.
**Update 31/1/06
Hmmmm... without the same sort of funding going to Joomla, I wonder if they'll be able to keep up their rebuilding efforts, and whether or not an ecommerce aspect will come of it, etc.
http://www.joomla.org
This information may also be of use on Joomla's site:
http://www.joomla.org/index.php?Itemid=44&option=com_faq&catid=7
Lastly, everyone: if you find this software useful, please don't forget to donate!
MDaemon & World Client
One of the schools I support chose MDaemon as their email solution not long ago. It tries closely to emulate Exchange Server, with a couple of nice extras that one would normally pay BIG $/£ for with a "Connector" license. However, if your organization uses any PDAs, BlackBerrys, etc., do yourself and your organization a favour and get Exchange. The synchronization issues and incompatibilities with MDaemon eat into support hours very quickly!
BUT, if you do decide to use MDaemon, or you are already blessed with the responsibility of administering one, you will no doubt come across product activation. It works much the same as MS's activation: it is done automatically through the internet. When that doesn't work, a phone call or email gets you the help you need to activate (you have a 30-day limit) within a day or two. Why am I going into all this? If your NIC dies, or you swap in a new NIC because you are upgrading, you will HAVE TO reactivate. What a pain, yes. What I discovered in the process were some events that began to appear in my DNS logs afterwards.
If you are swapping a NIC on a domain controller, uninstall the old NIC first. If, like me, you want to make sure the new NIC works first, you won't do that, and you'll end up with two NICs on the same box. That gets a bit tricky with DNS events 6701/2, bindings, and static IP addresses, so do yourself a favour and uninstall the old NIC first.
Once I had both NICs on the domain controller/mail server, I simply shut the server down and removed the old NIC. The problem with that is the old NIC's settings stay hidden in the operating system's hardware configuration: any time you go to view your TCP/IP properties, a pop-up dialogue box alerts you to having two NICs with the same IP. If you didn't uninstall the old NIC like you should have, you can do so by going to Add/Remove Hardware and ticking the "hidden" checkbox. You will then be able to remove your old invisible NIC and avoid potential DNS issues and such.
Tuesday, November 08, 2005
Linux the lonely island
At one of the schools I support, we have a *nix box running our filtering proxy as well as an internal website. Having an AD setup, I realized a bit of work needs to be done in order to get people to log in to the website with their AD accounts. I came across a link on how to do just this, step-by-step with YaST, but not for Mandrake/Mandriva or any other flavour. It involves using either Samba with winbind or the Kerberos client. I have no experience yet with the Kerberos client, so I think I'll have to try that out. The most recent Samba RPM I can get for Mandrake just doesn't want to do any more than share the top parent folder in Network Neighborhood. Even worse, it will show the files, but not share them!
About this time last year I was running a complete Linux domain with XP clients and roaming profiles. Once set up it all worked a charm, BUT it all came back to TCO and SSO (single sign-on). Some hardcore peeps out there will put it down to me not doing it right or being out of my depth. Silliness. Always do what you can with what you have. There is no way to survive as a tech in schools/non-profits without accepting the politics from above that enforce a hodge-podge approach because of their impatience. Setting up network apps and a single point of admin for antivirus took weeks instead of several hours. I found afterwards that when a question or problem arose, I was no longer racking my brain for the correct man page to make a tweak to Samba or Wine etc. What our Linux box has done an awesome job at, though, is our filtering and website (Squid with DansGuardian, and Apache). If you attempt to set something like that up with limited experience, give the distro Karoshi a try. It's aimed at schools but approaches networks with a corporate lockdown perspective. I still use pieces of that distro in various network setups and it runs great.
Tuesday, October 18, 2005
Active Directory and Backups II
What a complete waste it became. The Active Directory was so thrashed that there was no way to recover it and make anything useful.
That said, I am going to put backups and secondary domain controllers on hold for a while until I've done some more self-educating on the subject.
The new network is in place, however, and humming along quite well. There was a small glitch in getting the 98 machines printing again (I had to add their computer names and users to the AD store), but that is now resolved. I ended up reinstalling the 98 dsclient, but I'm not sure that was necessary; they don't log in to the network.
I hope one of the schools I'm supporting will decide it's worth their money to invest in a backup solution beyond ntbackup. With external drives so cheap these days and the school's heavy use of multimedia, using ntbackup to back up to DVDs each week is a bit tedious. More on that later, and the solution that gets put in. I don't want to turn this blog into a bitch session (yet, anyway).
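For what it's worth, even before a proper backup product goes in, pointing ntbackup at a cheap external drive instead of DVDs is a one-liner that can be dropped into Scheduled Tasks. This is a sketch; the source folder, drive letter, job name and .bkf path are all placeholders:

```bat
rem Weekly full backup of the shares folder to an external USB drive (E:)
ntbackup backup D:\Shares /j "Weekly full" /f "E:\Backups\weekly.bkf" /m normal /v:yes
```

The /j switch names the job in the backup log, /f targets a file instead of tape, /m normal makes it a full backup, and /v:yes verifies after writing.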
MDaemon and Palm Synching with Outlook
For the past six months, one of the networks I support has used MDaemon as a groupware solution, and it works quite well. For our small organization (just under 100 users) it scales just fine. The only problem has been syncing between a Palm Tungsten T2 and Outlook.
At the time of this writing, the latest software is in use across the board: MDaemon 8.1.3 and Connector 2.0.4, both of which run pretty smoothly (I can't recall the Palm software version). There is one central shared calendar used as a "diary" for the entire school day. These events/entries are set to be editable by the owner and two office administrators. Somewhere along the line, syncing the Palm seemed to be wiping any entered data. The first thought may be, "Ah HA! Your configuration is set to have the Palm override any settings!" but this is not the case. In fact, there doesn't seem to be any case for it. Very difficult to explain. The only thought I have is that the laptop these syncs were coming from was joined to an old domain, while the mail server is in a new domain (with the same name).
Personally, I don't see how it could be any more than operator error, as there is nothing in the logs to suggest otherwise.