Re: bootable failed sw raid 1 with F9

Sander Hoentjen wrote:
Hi list,

For the first time in my life I tried to install Fedora with sw RAID.
See below for what went wrong.

Here is what I did:
Start with 2 empty 500GB sata disks.
Make sure nvraid is turned off in my BIOS.
Start an F9 install, creating 2 sw RAID partitions: md0 and md1.
md0 is 100MB and has an ext3 /boot.
md1 has the rest of the space and is LVM.
Inside the LVM I created the rest of my partitions.
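
For reference, roughly the same layout could be built by hand with mdadm and LVM. This is only a rough sketch with example device, volume group and logical volume names, not exactly what anaconda did:

mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sda1 /dev/sdb1
mdadm --create /dev/md1 --level=1 --raid-devices=2 /dev/sda2 /dev/sdb2
mkfs.ext3 /dev/md0                            # the ~100MB /boot
pvcreate /dev/md1                             # LVM on the big array
vgcreate VolGroup00 /dev/md1
lvcreate -L 2G -n LogVol01 VolGroup00         # e.g. swap
lvcreate -l 100%FREE -n LogVol00 VolGroup00   # e.g. root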

The install went great; after the reboot my system booted fine, so far so good.
I then shut down my system, pulled out a disk and started again. I got
the message "GRUB Hard Disk Error". So I shut down, plugged the disk
back in, pulled out the other one and started again. This time I was met
by a GRUB shell: no boot logo, no menu, no idea what to do.
Shut down again, replugged the disk, started again, got on IRC, and typed:
grub
root (hd0,0)
setup (hd0)
root (hd1,0)
setup (hd1)
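
If I understand it right, "root (hdN,0)" points GRUB at the partition holding /boot and "setup (hdN)" writes the boot loader to that disk's MBR, so running it for both hd0 and hd1 should leave either disk bootable on its own. Presumably the same thing could be scripted rather than typed interactively, something like:

grub --batch <<EOF
root (hd0,0)
setup (hd0)
root (hd1,0)
setup (hd1)
quit
EOF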

After that: reboot minus one disk. I can see GRUB, with the logo and boot
options. It starts OK, I even get rhgb for a second, and then I see:
"fsck.ext3: Invalid argument while trying to open /dev/md0"
I can go into a maintenance shell, and when I do cat /proc/mdstat I see:
md0 : inactive sda1[0](s)

"mdadm --assemble /dev/md0" turns it active again, but well I have no
idea how I can continue normal boot, if it is even possible.
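
Guessing a bit, but from the maintenance shell something along these lines ought to start the degraded arrays and, once the missing disk is back in, resync them (the sdb names are only an example):

mdadm --assemble --run /dev/md0      # start md0 even though a member is missing
mdadm --assemble --run /dev/md1      # same for the array holding the LVM
exit                                 # leave the maintenance shell so boot can continue
# later, with the pulled disk plugged back in:
mdadm /dev/md0 --add /dev/sdb1
mdadm /dev/md1 --add /dev/sdb2
cat /proc/mdstat                     # watch the resync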

So this is my story, now my questions:
- Did I do anything wrong? I performed the installation twice, with the
same result both times.
- Is this a bug somewhere? Do other people get the same or better
results?
- Is there anything I can do to fix this?

Thanks for reading this far,

Sander


I have already experienced this problem and raised a report on Red Hat Bugzilla (no. 450722), although there has been no response to it so far. I spent some time narrowing the problem down to Fedora 9 (it is OK on Fedora 8 plus updates).

I have gone back to Fedora 8 for the particular machine I required RAID 1 for.

However, there is a sort of workaround.

Firstly, the machine boots if both discs are present. If the partitions on the failing disc are marked faulty with mdadm (--fail / --set-faulty), then the machine will reboot successfully with that disc removed (provided you originally 'grub'd' both discs).
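
Concretely, assuming the disc about to be removed is /dev/sdb with partitions sdb1 and sdb2 in the two arrays, something like this before shutting down seems to do the trick:

mdadm /dev/md0 --fail /dev/sdb1 --remove /dev/sdb1
mdadm /dev/md1 --fail /dev/sdb2 --remove /dev/sdb2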

I have persuaded myself that this is probably acceptable, since the usual reason for rebooting with only one disc is that the other has already been at least partly marked faulty.

I do consider it a serious fault, since it hits you at exactly the moment the reason for having RAID 1 discs is justified: when one of them fails.

Hope this helps

John Whitley

--
fedora-list mailing list
fedora-list@xxxxxxxxxx
To unsubscribe: https://www.redhat.com/mailman/listinfo/fedora-list
