[BUG] Creating a bare metal server fails with a creation error #19800

Open
dengni-10000 opened this issue Mar 26, 2024 · 0 comments
release/3.11
After the physical machine was added successfully, creating the bare metal deployment fails. The baremetal-agent log is as follows:
[debug 2024-03-26 08:31:36 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): whoami
[info 2024-03-26 08:31:36 tasks.(*SBaremetalServerBaseDeployTask).OnPXEBoot(basedeploy.go:97)] BaremetalServerBaseDeployTask called on stage pxeboot, args:
[debug 2024-03-26 08:31:36 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): test -d /sys/firmware/efi && echo is || echo not
[info 2024-03-26 08:31:36 megactl.newRaid(driver.go:77)] Not use perccli, use legacy megaraid driver
[debug 2024-03-26 08:31:36 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/adaptec/arcconf LIST
[warning 2024-03-26 08:31:37 detect_storages.DetectStorageInfo(detect_storages.go:58)] Raid driver AdaptecRaid ParsePhyDevs: Not found adaptec raid controller
[debug 2024-03-26 08:31:37 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /sbin/lsmod
[debug 2024-03-26 08:31:37 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/hp/hpssacli/bld/hpssacli controller all show
[warning 2024-03-26 08:31:37 detect_storages.DetectStorageInfo(detect_storages.go:58)] Raid driver HPSARaid ParsePhyDevs: "/opt/hp/hpssacli/bld/hpssacli controller all show" error: Process exited with status 1, cmd error:
Error: No controllers detected. Possible causes:
- The driver for the installed controller(s) is not loaded.
- On LINUX, the scsi_generic (sg) driver module is not loaded.
See the README file for more details.

[debug 2024-03-26 08:31:37 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /sbin/lsmod
[debug 2024-03-26 08:31:37 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/MegaRAID/MegaCli/MegaCli64 -PDList -aALL
[debug 2024-03-26 08:31:37 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/MegaRAID/MegaCli/MegaCli64 -CfgDsply -a0
[debug 2024-03-26 08:31:37 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/MegaRAID/MegaCli/MegaCli64 -adpgetpciinfo -a0
[debug 2024-03-26 08:31:37 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): ls /sys/bus/pci/devices/0000:31:00.0/ | grep host
[debug 2024-03-26 08:31:37 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): ls /sys/bus/pci/devices/0000:31:00.0/host2/ | grep target | head -n 1
[debug 2024-03-26 08:31:37 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/mvcli/mvcli info -o hba
[warning 2024-03-26 08:31:40 detect_storages.DetectStorageInfo(detect_storages.go:58)] Raid driver MarvelRaid ParsePhyDevs: Empty adapters
[debug 2024-03-26 08:31:40 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): lspci -k | grep mpt2sas
[debug 2024-03-26 08:31:40 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): lspci -k | grep mpt3sas
[debug 2024-03-26 08:31:40 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/lsi/sas3ircu LIST
[debug 2024-03-26 08:31:40 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/lsi/sas3ircu 0 DISPLAY
[warning 2024-03-26 08:31:40 detect_storages.DetectStorageInfo(detect_storages.go:58)] Raid driver Mpt2SAS ParsePhyDevs: No raid support
[info 2024-03-26 08:31:40 detect_storages.DetectStorageInfo(detect_storages.go:73)] Get Raid drivers: [MegaRaid], collecting disks info ...
[debug 2024-03-26 08:31:40 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /lib/mos/lsdisk --pcie
[debug 2024-03-26 08:31:40 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /lib/mos/lsdisk --nonraid
[debug 2024-03-26 08:31:45 cronman.(*SCronJob).runJobInWorker(cronman.go:453)] Cron job: BaremetalCronJobs started, startTime: 2024-03-26T08:31:45Z
[debug 2024-03-26 08:31:50 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /lib/mos/lsdisk --nonraid
[debug 2024-03-26 08:31:55 cronman.(*SCronJob).runJobInWorker(cronman.go:453)] Cron job: BaremetalCronJobs started, startTime: 2024-03-26T08:31:55Z
[debug 2024-03-26 08:32:00 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /lib/mos/lsdisk --nonraid
[debug 2024-03-26 08:32:05 cronman.(*SCronJob).runJobInWorker(cronman.go:453)] Cron job: BaremetalCronJobs started, startTime: 2024-03-26T08:32:05Z
[debug 2024-03-26 08:32:10 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /lib/mos/lsdisk --nonraid
[debug 2024-03-26 08:32:15 cronman.(*SCronJob).runJobInWorker(cronman.go:453)] Cron job: BaremetalCronJobs started, startTime: 2024-03-26T08:32:15Z
[debug 2024-03-26 08:32:20 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /lib/mos/lsdisk --nonraid
[warning 2024-03-26 08:32:22 appsrv.do_worker_watchdog(workers_watchdog.go:64)] WorkerManager BaremetalTaskWorkerManager has been busy for 2 cycles...
[debug 2024-03-26 08:32:25 cronman.(*SCronJob).runJobInWorker(cronman.go:453)] Cron job: BaremetalCronJobs started, startTime: 2024-03-26T08:32:25Z
[debug 2024-03-26 08:32:30 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /lib/mos/lsdisk --nonraid
[debug 2024-03-26 08:32:35 cronman.(*SCronJob).runJobInWorker(cronman.go:453)] Cron job: BaremetalCronJobs started, startTime: 2024-03-26T08:32:35Z
[info 2024-03-26 08:32:40 detect_storages.DetectStorageInfo(detect_storages.go:95)] RaidDiskInfo: [{"adapter":0,"block":512,"driver":"MegaRaid","enclousure":251,"index":0,"max_strip_size":64,"min_strip_size":64,"model":"S6KNNE0T517179 SAMSUNG MZ7L3960HCJR-00B7C JXTC304Q","rotate":false,"sector":1874329600,"size":915200,"slot":0,"status":"unconfigured_good"},{"adapter":0,"block":512,"driver":"MegaRaid","enclousure":251,"index":0,"max_strip_size":64,"min_strip_size":64,"model":"S6KNNE0T517186 SAMSUNG MZ7L3960HCJR-00B7C JXTC304Q","rotate":false,"sector":1874329600,"size":915200,"slot":1,"status":"online"}], NonRaidSCSIDiskInfo: [], PCIEDiskInfo: []
[error 2024-03-26 08:32:40 baremetal.(*SBaremetalServer).DoDiskConfig(manager.go:2625)] ===layouts: [
{
"conf":
{
"adapter": 0,
"conf": "raid0",
"count": 2,
"driver": "MegaRaid",
"range":
[
0,
1
],
"type": "ssd"
},
"disks":
[
{
"adapter": 0,
"block": 512,
"driver": "MegaRaid",
"enclousure": 251,
"index": 0,
"max_strip_size": 64,
"min_strip_size": 64,
"model": "S6KNNE0T517179 SAMSUNG MZ7L3960HCJR-00B7C JXTC304Q",
"rotate": false,
"sector": 1874329600,
"size": 915200,
"slot": 0,
"status": "unconfigured_good"
},
{
"adapter": 0,
"block": 512,
"driver": "MegaRaid",
"enclousure": 251,
"index": 1,
"max_strip_size": 64,
"min_strip_size": 64,
"model": "S6KNNE0T517186 SAMSUNG MZ7L3960HCJR-00B7C JXTC304Q",
"rotate": false,
"sector": 1874329600,
"size": 915200,
"slot": 1,
"status": "online"
}
],
"size": 1830400
}
]
[error 2024-03-26 08:32:40 baremetal.(*SBaremetalServer).DoDiskConfig(manager.go:2627)] ===diskConfs: [
{
"adapter": 0,
"configs":
[
{
"adapter": 0,
"conf": "raid0",
"count": 2,
"driver": "MegaRaid",
"range":
[
0,
1
],
"type": "ssd"
}
],
"driver": "MegaRaid"
}
]
[info 2024-03-26 08:32:40 megactl.newRaid(driver.go:77)] Not use perccli, use legacy megaraid driver
[debug 2024-03-26 08:32:40 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /sbin/lsmod
[debug 2024-03-26 08:32:40 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/MegaRAID/MegaCli/MegaCli64 -PDList -aALL
[debug 2024-03-26 08:32:40 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/MegaRAID/MegaCli/MegaCli64 -CfgDsply -a0
[debug 2024-03-26 08:32:40 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/MegaRAID/MegaCli/MegaCli64 -adpgetpciinfo -a0
[debug 2024-03-26 08:32:40 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): ls /sys/bus/pci/devices/0000:31:00.0/ | grep host
[debug 2024-03-26 08:32:40 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): ls /sys/bus/pci/devices/0000:31:00.0/host2/ | grep target | head -n 1
[debug 2024-03-26 08:32:40 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/MegaRAID/MegaCli/MegaCli64 -PDMakeGood -PhysDrv '[251:0]' -Force -a0
[debug 2024-03-26 08:32:40 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/MegaRAID/MegaCli/MegaCli64 -PDMakeGood -PhysDrv '[251:1]' -Force -a0
[error 2024-03-26 08:32:40 megactl.(*MegaRaidAdaptor).clearJBODDisks(megactl.go:959)] megacliClearAllJBODDisks error: [PDMakeGood megacli cmd /opt/MegaRAID/MegaCli/MegaCli64 -PDMakeGood -PhysDrv ' egaCli/MegaCli64 -PDMakeGood -PhysDrv '[251:0]' -Force -a0" error: Process exited with status 1, cmd error:
Adapter: 0: Failed to change PD state at EnclId-251 SlotId-0.

Exit Code: 0x01
, PDMakeGood megacli cmd /opt/MegaRAID/MegaCli/MegaCli64 -PDMakeGood -PhysDrv '[251:1]' -Force -a0: "/opt/MegaRAID/MegaCli/MegaCli64 -PDMakeGood -PhysDrv '[251:1]' -Force -a0" error: Process exite
Adapter: 0: Failed to change PD state at EnclId-251 SlotId-1.

Exit Code: 0x01
]
[info 2024-03-26 08:32:40 megactl.(*MegaRaidAdaptor).clearJBODDisks(megactl.go:960)] try storcliClearJBODDisks
[debug 2024-03-26 08:32:40 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/MegaRAID/storcli/storcli64 /call show | grep -iE '^(Controller|Product Name|Serial Number|Bus Number|Device Number|Function Number)\s='
[debug 2024-03-26 08:32:41 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/MegaRAID/storcli/storcli64 /c0/e251/s0 set good force
[debug 2024-03-26 08:32:41 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/MegaRAID/storcli/storcli64 /call show | grep -iE '^(Controller|Product Name|Serial Number|Bus Number|Device Number|Function Number)\s='
[debug 2024-03-26 08:32:41 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/MegaRAID/storcli/storcli64 /c0/e251/s1 set good force
[error 2024-03-26 08:32:41 megactl.(*MegaRaidAdaptor).clearJBODDisks(megactl.go:962)] storcliClearJBODDisks error: [Set PD good storcli cmd /opt/MegaRAID/storcli/storcli64 /c0/e251/s0 set good force: "/opt/MegaRAID/storcli/storcli64 /c0/e251/s0 set good force" error: Process exited with status 255, cmd error: CLI Version = 007.2612.0000.0000 June 13, 2023
Operating system = Linux 6.1.0-13-amd64
Controller = 0
Status = Failure
Description = Set Drive Good Failed.

Detailed Status :


Drive Status ErrCd ErrMsg

/c0/e251/s0 Failure 255 Operation not allowed.

, Set PD good storcli cmd /opt/MegaRAID/storcli/storcli64 /c0/e251/s1 set good force: "/opt/MegaRAID/storcli/storcli64 /c0/e251/s1 set good force" error: Process exited with status 255, cmd error: CLI Version = 007.2612.0000.0000 June 13, 2023
Operating system = Linux 6.1.0-13-amd64
Controller = 0
Status = Failure
Description = Set Drive Good Failed.

Detailed Status :


Drive Status ErrCd ErrMsg

/c0/e251/s1 Failure 255 Operation not allowed.

]
[debug 2024-03-26 08:32:41 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/MegaRAID/MegaCli/MegaCli64 -AdpSetProp -EnableJBOD -1 -a0
[error 2024-03-26 08:32:41 megactl.(*MegaRaidAdaptor).megacliEnableJBOD(megactl.go:877)] enable jbod true fail: "/opt/MegaRAID/MegaCli/MegaCli64 -AdpSetProp -EnableJBOD -1 -a0" error: Process exit
Adapter 0: Failed to Set Adapter Properties.

Exit Code: 0x01

[debug 2024-03-26 08:32:41 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/MegaRAID/MegaCli/MegaCli64 -AdpSetProp -EnableJBOD -0 -a0
[debug 2024-03-26 08:32:41 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/MegaRAID/MegaCli/MegaCli64 -AdpSetProp -EnableJBOD -1 -a0
[error 2024-03-26 08:32:41 megactl.(*MegaRaidAdaptor).megacliEnableJBOD(megactl.go:877)] enable jbod true fail: "/opt/MegaRAID/MegaCli/MegaCli64 -AdpSetProp -EnableJBOD -1 -a0" error: Process exit
Adapter 0: Failed to Set Adapter Properties.

Exit Code: 0x01

[debug 2024-03-26 08:32:41 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/MegaRAID/MegaCli/MegaCli64 -AdpSetProp -EnableJBOD -0 -a0
[debug 2024-03-26 08:32:41 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/MegaRAID/MegaCli/MegaCli64 -LDInfo -Lall -a0
[info 2024-03-26 08:32:41 megactl.newRaid(driver.go:77)] Not use perccli, use legacy megaraid driver
[debug 2024-03-26 08:32:41 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /sbin/lsmod
[debug 2024-03-26 08:32:41 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/MegaRAID/MegaCli/MegaCli64 -PDList -aALL
[debug 2024-03-26 08:32:41 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/MegaRAID/MegaCli/MegaCli64 -CfgDsply -a0
[debug 2024-03-26 08:32:41 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/MegaRAID/MegaCli/MegaCli64 -adpgetpciinfo -a0
[debug 2024-03-26 08:32:41 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): ls /sys/bus/pci/devices/0000:31:00.0/ | grep host
[debug 2024-03-26 08:32:41 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): ls /sys/bus/pci/devices/0000:31:00.0/host2/ | grep target | head -n 1
[debug 2024-03-26 08:32:41 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/MegaRAID/MegaCli/MegaCli64 -CfgForeign -Clear -aALL
[debug 2024-03-26 08:32:41 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/MegaRAID/MegaCli/MegaCli64 -PDMakeGood -PhysDrv '[251:0]' -Force -a0
[debug 2024-03-26 08:32:41 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/MegaRAID/MegaCli/MegaCli64 -PDMakeGood -PhysDrv '[251:1]' -Force -a0
[error 2024-03-26 08:32:41 megactl.(*MegaRaidAdaptor).clearJBODDisks(megactl.go:959)] megacliClearAllJBODDisks error: [PDMakeGood megacli cmd /opt/MegaRAID/MegaCli/MegaCli64 -PDMakeGood -PhysDrv ' egaCli/MegaCli64 -PDMakeGood -PhysDrv '[251:0]' -Force -a0" error: Process exited with status 1, cmd error:
Adapter: 0: Failed to change PD state at EnclId-251 SlotId-0.

Exit Code: 0x01
, PDMakeGood megacli cmd /opt/MegaRAID/MegaCli/MegaCli64 -PDMakeGood -PhysDrv '[251:1]' -Force -a0: "/opt/MegaRAID/MegaCli/MegaCli64 -PDMakeGood -PhysDrv '[251:1]' -Force -a0" error: Process exite
Adapter: 0: Failed to change PD state at EnclId-251 SlotId-1.

Exit Code: 0x01
]
[info 2024-03-26 08:32:41 megactl.(*MegaRaidAdaptor).clearJBODDisks(megactl.go:960)] try storcliClearJBODDisks
[debug 2024-03-26 08:32:41 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/MegaRAID/storcli/storcli64 /call show | grep -iE '^(Controller|Product Name|Serial Number|Bus Number|Device Number|Function Number)\s='
[debug 2024-03-26 08:32:41 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/MegaRAID/storcli/storcli64 /c0/e251/s0 set good force
[debug 2024-03-26 08:32:41 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/MegaRAID/storcli/storcli64 /call show | grep -iE '^(Controller|Product Name|Serial Number|Bus Number|Device Number|Function Number)\s='
[debug 2024-03-26 08:32:41 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/MegaRAID/storcli/storcli64 /c0/e251/s1 set good force
[error 2024-03-26 08:32:41 megactl.(*MegaRaidAdaptor).clearJBODDisks(megactl.go:962)] storcliClearJBODDisks error: [Set PD good storcli cmd /opt/MegaRAID/storcli/storcli64 /c0/e251/s0 set good force: "/opt/MegaRAID/storcli/storcli64 /c0/e251/s0 set good force" error: Process exited with status 255, cmd error: CLI Version = 007.2612.0000.0000 June 13, 2023
Operating system = Linux 6.1.0-13-amd64
Controller = 0
Status = Failure
Description = Set Drive Good Failed.

Detailed Status :


Drive Status ErrCd ErrMsg

/c0/e251/s0 Failure 255 Operation not allowed.

, Set PD good storcli cmd /opt/MegaRAID/storcli/storcli64 /c0/e251/s1 set good force: "/opt/MegaRAID/storcli/storcli64 /c0/e251/s1 set good force" error: Process exited with status 255, cmd error: CLI Version = 007.2612.0000.0000 June 13, 2023
Operating system = Linux 6.1.0-13-amd64
Controller = 0
Status = Failure
Description = Set Drive Good Failed.

Detailed Status :


Drive Status ErrCd ErrMsg

/c0/e251/s1 Failure 255 Operation not allowed.

]
[debug 2024-03-26 08:32:41 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/MegaRAID/MegaCli/MegaCli64 -AdpSetProp -EnableJBOD -1 -a0
[error 2024-03-26 08:32:41 megactl.(*MegaRaidAdaptor).megacliEnableJBOD(megactl.go:877)] enable jbod true fail: "/opt/MegaRAID/MegaCli/MegaCli64 -AdpSetProp -EnableJBOD -1 -a0" error: Process exit
Adapter 0: Failed to Set Adapter Properties.

Exit Code: 0x01

[debug 2024-03-26 08:32:41 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/MegaRAID/MegaCli/MegaCli64 -AdpSetProp -EnableJBOD -0 -a0
[debug 2024-03-26 08:32:41 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/MegaRAID/MegaCli/MegaCli64 -AdpSetProp -EnableJBOD -1 -a0
[error 2024-03-26 08:32:41 megactl.(*MegaRaidAdaptor).megacliEnableJBOD(megactl.go:877)] enable jbod true fail: "/opt/MegaRAID/MegaCli/MegaCli64 -AdpSetProp -EnableJBOD -1 -a0" error: Process exit
Adapter 0: Failed to Set Adapter Properties.

Exit Code: 0x01

[debug 2024-03-26 08:32:41 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/MegaRAID/MegaCli/MegaCli64 -AdpSetProp -EnableJBOD -0 -a0
[debug 2024-03-26 08:32:41 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/MegaRAID/MegaCli/MegaCli64 -LDInfo -Lall -a0
[info 2024-03-26 08:32:41 megactl.(*MegaRaidAdaptor).megacliBuildRaid(megactl.go:717)] _megacliBuildRaid command: /opt/MegaRAID/MegaCli/MegaCli64 -CfgLdAdd -r0 [251:0,251:1] -a0
[debug 2024-03-26 08:32:41 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/MegaRAID/MegaCli/MegaCli64 -CfgLdAdd -r0 [251:0,251:1] -a0
[debug 2024-03-26 08:32:41 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/MegaRAID/storcli/storcli64 /call show | grep -iE '^(Controller|Product Name|Serial Number|Bus Number|Device Number|Function Number)\s='
[debug 2024-03-26 08:32:41 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /opt/MegaRAID/storcli/storcli64 /c0 add vd type=r0 drives=251:0,251:1
[error 2024-03-26 08:32:41 tasks.(*SBaremetalServerCreateTask).onError(create.go:104)] Create server error: Build MegaRaid raid failed: Driver MegaRaid, adapter 0 build raid: Build raid raid0: ["/ Add -r0 [251:0,251:1] -a0" error: Process exited with status 11, cmd error:
Mix of configured and unconfigured drives are not possible.

Exit Code: 0x0b
, "/opt/MegaRAID/storcli/storcli64 /c0 add vd type=r0 drives=251:0,251:1" error: Process exited with status 11, cmd error: CLI Version = 007.2612.0000.0000 June 13, 2023
Operating system = Linux 6.1.0-13-amd64
Controller = 0
Status = Failure
Description = resources already in use

]
[debug 2024-03-26 08:32:41 ssh.(*Client).run(ssh.go:183)] Run command(root@192.168.5.3): /lib/mos/partdestroy.sh
[debug 2024-03-26 08:32:42 ipmitool.(*LanPlusIPMI).ExecuteCommand(ipmitool.go:134)]LanPlusIPMI: execute command: ipmitool -I lanplus -H 192.168.4.104 -p 623 -U Administrator -P Admin@9000 chassis power status
[debug 2024-03-26 08:32:42 procutils.(*Command).Output(procutils.go:95)] Exec command: ipmitool [-I lanplus -H 192.168.4.104 -p 623 -U Administrator -P Admin@9000 chassis power status]
[info 2024-03-26 08:32:42 baremetal.(*SBaremetalInstance).SyncStatus(manager.go:857)] Update baremetal 5ecf2c4b-87d6-4577-8896-d34d4617ed4f to status running
[error 2024-03-26 08:32:42 tasks.executeTask(worker.go:89)] Execute task BaremetalServerCreateTask error: Do deploy: Build MegaRaid raid failed: Driver MegaRaid, adapter 0 build raid: Build raid r 64 -CfgLdAdd -r0 [251:0,251:1] -a0" error: Process exited with status 11, cmd error:
Mix of configured and unconfigured drives are not possible.

Exit Code: 0x0b
, "/opt/MegaRAID/storcli/storcli64 /c0 add vd type=r0 drives=251:0,251:1" error: Process exited with status 11, cmd error: CLI Version = 007.2612.0000.0000 June 13, 2023
Operating system = Linux 6.1.0-13-amd64
Controller = 0
Status = Failure
Description = resources already in use

]
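Reading the log, the controller reports the drive in slot 0 as "unconfigured_good" but the drive in slot 1 as "online", i.e. it is still part of an existing configuration. That is consistent with every later failure: PDMakeGood / "set good force" return "Operation not allowed", -CfgLdAdd fails with "Mix of configured and unconfigured drives are not possible", and storcli "add vd" fails with "resources already in use". A possible workaround (my assumption, not a confirmed fix from the project) is to manually clear the stale virtual drive / foreign configuration on adapter 0 from the baremetal-agent SSH session before retrying the deployment, roughly along these lines:

# Hedged sketch, standard MegaCli/storcli usage only; this wipes the RAID
# configuration on adapter 0, so verify the drives hold no needed data first.
/opt/MegaRAID/MegaCli/MegaCli64 -LDInfo -Lall -a0          # inspect existing logical drives
/opt/MegaRAID/MegaCli/MegaCli64 -CfgForeign -Clear -aALL   # drop any foreign configuration
/opt/MegaRAID/MegaCli/MegaCli64 -CfgClr -a0                # clear the existing config so slot 1 leaves "online"
# or, equivalently, with storcli:
/opt/MegaRAID/storcli/storcli64 /c0/vall delete force

After that, both drives should show "unconfigured_good" and the create task's "-CfgLdAdd -r0 [251:0,251:1] -a0" step should no longer hit the configured/unconfigured mix error.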

dengni-10000 added the bug label on Mar 26, 2024
github-actions bot added the stale label on Apr 26, 2024