Apr 24 22:27:31.105076 ip-10-0-133-161 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 22:27:31.105085 ip-10-0-133-161 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 22:27:31.105092 ip-10-0-133-161 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 22:27:31.105334 ip-10-0-133-161 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 22:27:41.238536 ip-10-0-133-161 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 22:27:41.238551 ip-10-0-133-161 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 2dfa5cdfd47b47f5ac104fc6b1bc9a2b --
Apr 24 22:29:50.142325 ip-10-0-133-161 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 22:29:50.629087 ip-10-0-133-161 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 22:29:50.629087 ip-10-0-133-161 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 22:29:50.629087 ip-10-0-133-161 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 22:29:50.629087 ip-10-0-133-161 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 22:29:50.629087 ip-10-0-133-161 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 22:29:50.630136 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.630044 2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 22:29:50.637469 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637447 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 22:29:50.637469 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637464 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 22:29:50.637469 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637471 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 22:29:50.637469 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637475 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 22:29:50.637718 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637481 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 22:29:50.637718 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637487 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 22:29:50.637718 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637491 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 22:29:50.637718 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637495 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 22:29:50.637718 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637499 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 22:29:50.637718 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637503 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 22:29:50.637718 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637506 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 22:29:50.637718 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637511 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 22:29:50.637718 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637514 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 22:29:50.637718 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637519 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 22:29:50.637718 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637523 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 22:29:50.637718 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637527 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 22:29:50.637718 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637531 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 22:29:50.637718 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637535 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 22:29:50.637718 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637540 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 22:29:50.637718 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637544 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 22:29:50.637718 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637548 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 22:29:50.637718 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637553 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 22:29:50.637718 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637557 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:29:50.638357 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637562 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 22:29:50.638357 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637572 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 22:29:50.638357 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637576 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 22:29:50.638357 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637580 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 22:29:50.638357 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637584 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 22:29:50.638357 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637588 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 22:29:50.638357 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637592 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 22:29:50.638357 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637597 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 22:29:50.638357 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637601 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 22:29:50.638357 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637606 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 22:29:50.638357 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637610 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 22:29:50.638357 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637614 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 22:29:50.638357 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637619 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 22:29:50.638357 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637623 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 22:29:50.638357 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637629 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 22:29:50.638357 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637634 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 22:29:50.638357 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637638 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 22:29:50.638357 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637642 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 22:29:50.638357 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637646 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 22:29:50.638357 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637651 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 22:29:50.638357 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637655 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 22:29:50.638871 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637659 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 22:29:50.638871 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637663 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 22:29:50.638871 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637667 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 22:29:50.638871 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637672 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 22:29:50.638871 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637676 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 22:29:50.638871 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637680 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 22:29:50.638871 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637684 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 22:29:50.638871 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637688 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:29:50.638871 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637692 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 22:29:50.638871 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637697 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 22:29:50.638871 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637702 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 22:29:50.638871 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637707 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 22:29:50.638871 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637712 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 22:29:50.638871 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637716 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 22:29:50.638871 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637719 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 22:29:50.638871 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637723 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 22:29:50.638871 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637728 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 22:29:50.638871 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637732 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 22:29:50.638871 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637736 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 22:29:50.639398 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637740 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:29:50.639398 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637744 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 22:29:50.639398 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637748 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 22:29:50.639398 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637755 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 22:29:50.639398 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637759 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 22:29:50.639398 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637764 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:29:50.639398 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637769 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 22:29:50.639398 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637773 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 22:29:50.639398 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637777 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 22:29:50.639398 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637782 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 22:29:50.639398 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637787 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 22:29:50.639398 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637791 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 22:29:50.639398 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637795 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 22:29:50.639398 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637800 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:29:50.639398 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637804 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 22:29:50.639398 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637808 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 22:29:50.639398 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637812 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 22:29:50.639398 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637818 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 22:29:50.639398 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637825 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:29:50.639863 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637830 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 22:29:50.639863 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637835 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 22:29:50.639863 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637839 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 22:29:50.639863 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.637843 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 22:29:50.639863 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638415 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 22:29:50.639863 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638427 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 22:29:50.639863 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638431 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 22:29:50.639863 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638435 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 22:29:50.639863 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638437 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 22:29:50.639863 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638440 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 22:29:50.639863 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638443 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 22:29:50.639863 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638446 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 22:29:50.639863 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638449 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 22:29:50.639863 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638451 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 22:29:50.639863 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638454 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 22:29:50.639863 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638457 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 22:29:50.639863 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638459 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 22:29:50.639863 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638462 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 22:29:50.639863 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638465 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 22:29:50.639863 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638467 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 22:29:50.640364 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638470 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:29:50.640364 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638472 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 22:29:50.640364 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638475 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 22:29:50.640364 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638478 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 22:29:50.640364 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638480 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 22:29:50.640364 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638483 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 22:29:50.640364 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638485 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 22:29:50.640364 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638487 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 22:29:50.640364 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638490 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:29:50.640364 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638492 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 22:29:50.640364 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638495 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 22:29:50.640364 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638497 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 22:29:50.640364 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638500 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 22:29:50.640364 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638502 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 22:29:50.640364 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638505 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 22:29:50.640364 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638507 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 22:29:50.640364 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638510 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:29:50.640364 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638516 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 22:29:50.640364 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638519 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:29:50.640364 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638522 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 22:29:50.640869 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638524 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 22:29:50.640869 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638527 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 22:29:50.640869 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638529 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 22:29:50.640869 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638532 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 22:29:50.640869 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638534 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 22:29:50.640869 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638537 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 22:29:50.640869 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638539 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 22:29:50.640869 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638542 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 22:29:50.640869 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638544 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:29:50.640869 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638546 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 22:29:50.640869 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638549 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 22:29:50.640869 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638551 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 22:29:50.640869 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638554 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 22:29:50.640869 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638556 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 22:29:50.640869 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638559 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 22:29:50.640869 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638562 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 22:29:50.640869 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638564 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 22:29:50.640869 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638567 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 22:29:50.640869 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638569 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 22:29:50.640869 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638572 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 22:29:50.641370 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638575 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 22:29:50.641370 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638578 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 22:29:50.641370 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638580 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 22:29:50.641370 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638583 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 22:29:50.641370 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638585 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 22:29:50.641370 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638588 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 22:29:50.641370 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638590 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 22:29:50.641370 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638593 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 22:29:50.641370 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638595 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 22:29:50.641370 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638601 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 22:29:50.641370 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638604 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 22:29:50.641370 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638607 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 22:29:50.641370 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638609 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 22:29:50.641370 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638612 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 22:29:50.641370 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638615 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 22:29:50.641370 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638619 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 22:29:50.641370 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638623 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 22:29:50.641370 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638626 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 22:29:50.641370 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638630 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 22:29:50.641370 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638634 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 22:29:50.641857 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638637 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 22:29:50.641857 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638639 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 22:29:50.641857 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638642 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:29:50.641857 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638644 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 22:29:50.641857 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638647 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 22:29:50.641857 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638650 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 22:29:50.641857 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638669 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 22:29:50.641857 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638673 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 22:29:50.641857 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638678 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 22:29:50.641857 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.638681 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 22:29:50.641857 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639585 2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 22:29:50.641857 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639595 2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 22:29:50.641857 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639602 2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 22:29:50.641857 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639607 2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 22:29:50.641857 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639611 2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 22:29:50.641857 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639614 2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 22:29:50.641857 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639619 2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 22:29:50.641857 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639624 2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 22:29:50.641857 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639628 2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 22:29:50.641857 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639631 2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 22:29:50.642354 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639634 2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 22:29:50.642354 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639640 2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 22:29:50.642354 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639644 2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 22:29:50.642354 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639648 2574 flags.go:64] FLAG: --cgroup-root=""
Apr 24 22:29:50.642354 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639651 2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 22:29:50.642354 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639654 2574 flags.go:64] FLAG: --client-ca-file=""
Apr 24 22:29:50.642354 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639657 2574 flags.go:64] FLAG: --cloud-config=""
Apr 24 22:29:50.642354 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639660 2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 22:29:50.642354 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639663 2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 22:29:50.642354 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639667 2574 flags.go:64] FLAG: --cluster-domain=""
Apr 24 22:29:50.642354 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639669 2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 22:29:50.642354 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639673 2574 flags.go:64] FLAG: --config-dir=""
Apr 24 22:29:50.642354 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639676 2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 22:29:50.642354 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639679 2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 22:29:50.642354 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639683 2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 22:29:50.642354 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639687 2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 22:29:50.642354 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639690 2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 22:29:50.642354 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639693 2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 22:29:50.642354 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639696 2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 22:29:50.642354 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639699 2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 22:29:50.642354 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639702 2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 22:29:50.642354 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639705 2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 22:29:50.642354 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639709 2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 22:29:50.642354 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639713 2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 22:29:50.642354 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639716 2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 22:29:50.642978 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639719 2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 22:29:50.642978 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639722 2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 22:29:50.642978 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639725 2574 flags.go:64] FLAG: --enable-server="true"
Apr 24 22:29:50.642978 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639728 2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 22:29:50.642978 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639733 2574 flags.go:64] FLAG: --event-burst="100"
Apr 24 22:29:50.642978 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639736 2574 flags.go:64] FLAG: --event-qps="50"
Apr 24 22:29:50.642978 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639739 2574 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 22:29:50.642978 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639742 2574 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 22:29:50.642978 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639747 2574 flags.go:64] FLAG: --eviction-hard=""
Apr 24 22:29:50.642978 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639751 2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 22:29:50.642978 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639754 2574 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 22:29:50.642978 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639758 2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 22:29:50.642978 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639761 2574 flags.go:64] FLAG: --eviction-soft=""
Apr 24 22:29:50.642978 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639764 2574 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 22:29:50.642978 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639767 2574 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 22:29:50.642978 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639770 2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 22:29:50.642978 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639773 2574 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 22:29:50.642978 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639776 2574 flags.go:64] FLAG:
--fail-cgroupv1="false" Apr 24 22:29:50.642978 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639779 2574 flags.go:64] FLAG: --fail-swap-on="true" Apr 24 22:29:50.642978 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639782 2574 flags.go:64] FLAG: --feature-gates="" Apr 24 22:29:50.642978 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639785 2574 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 22:29:50.642978 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639788 2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 22:29:50.642978 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639792 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 22:29:50.642978 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639795 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 22:29:50.642978 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639798 2574 flags.go:64] FLAG: --healthz-port="10248" Apr 24 22:29:50.642978 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639801 2574 flags.go:64] FLAG: --help="false" Apr 24 22:29:50.643607 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639804 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-133-161.ec2.internal" Apr 24 22:29:50.643607 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639807 2574 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 22:29:50.643607 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639810 2574 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 22:29:50.643607 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639813 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 22:29:50.643607 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639816 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 22:29:50.643607 ip-10-0-133-161 kubenswrapper[2574]: I0424 
22:29:50.639819 2574 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 22:29:50.643607 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639822 2574 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 22:29:50.643607 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639825 2574 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 22:29:50.643607 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639827 2574 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 22:29:50.643607 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639830 2574 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 22:29:50.643607 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639834 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 22:29:50.643607 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639837 2574 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 22:29:50.643607 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639840 2574 flags.go:64] FLAG: --kube-reserved="" Apr 24 22:29:50.643607 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639843 2574 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 22:29:50.643607 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639848 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 22:29:50.643607 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639851 2574 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 22:29:50.643607 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639854 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 22:29:50.643607 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639857 2574 flags.go:64] FLAG: --lock-file="" Apr 24 22:29:50.643607 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639863 2574 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 22:29:50.643607 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639867 2574 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 22:29:50.643607 
ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639870 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 22:29:50.643607 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639876 2574 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 22:29:50.643607 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639879 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 22:29:50.644187 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639882 2574 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 22:29:50.644187 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639885 2574 flags.go:64] FLAG: --logging-format="text" Apr 24 22:29:50.644187 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639888 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 22:29:50.644187 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639891 2574 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 22:29:50.644187 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639894 2574 flags.go:64] FLAG: --manifest-url="" Apr 24 22:29:50.644187 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639897 2574 flags.go:64] FLAG: --manifest-url-header="" Apr 24 22:29:50.644187 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639902 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 22:29:50.644187 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639905 2574 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 22:29:50.644187 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639909 2574 flags.go:64] FLAG: --max-pods="110" Apr 24 22:29:50.644187 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639912 2574 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 22:29:50.644187 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639915 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 22:29:50.644187 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639918 2574 
flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 22:29:50.644187 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639921 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 22:29:50.644187 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639924 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 22:29:50.644187 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639927 2574 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 22:29:50.644187 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639929 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 22:29:50.644187 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639937 2574 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 22:29:50.644187 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639941 2574 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 22:29:50.644187 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639944 2574 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 22:29:50.644187 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639948 2574 flags.go:64] FLAG: --pod-cidr="" Apr 24 22:29:50.644187 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639951 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 22:29:50.644187 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639969 2574 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 22:29:50.644187 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639973 2574 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 22:29:50.644187 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639976 2574 flags.go:64] FLAG: --pods-per-core="0" Apr 24 22:29:50.644816 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639980 2574 flags.go:64] FLAG: --port="10250" Apr 24 22:29:50.644816 ip-10-0-133-161 kubenswrapper[2574]: I0424 
22:29:50.639983 2574 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 22:29:50.644816 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639986 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0845ab6472c3de3b0" Apr 24 22:29:50.644816 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639989 2574 flags.go:64] FLAG: --qos-reserved="" Apr 24 22:29:50.644816 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639993 2574 flags.go:64] FLAG: --read-only-port="10255" Apr 24 22:29:50.644816 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639996 2574 flags.go:64] FLAG: --register-node="true" Apr 24 22:29:50.644816 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.639999 2574 flags.go:64] FLAG: --register-schedulable="true" Apr 24 22:29:50.644816 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.640002 2574 flags.go:64] FLAG: --register-with-taints="" Apr 24 22:29:50.644816 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.640011 2574 flags.go:64] FLAG: --registry-burst="10" Apr 24 22:29:50.644816 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.640014 2574 flags.go:64] FLAG: --registry-qps="5" Apr 24 22:29:50.644816 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.640017 2574 flags.go:64] FLAG: --reserved-cpus="" Apr 24 22:29:50.644816 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.640020 2574 flags.go:64] FLAG: --reserved-memory="" Apr 24 22:29:50.644816 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.640024 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 22:29:50.644816 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.640027 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 22:29:50.644816 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.640030 2574 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 22:29:50.644816 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.640033 2574 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 22:29:50.644816 ip-10-0-133-161 
kubenswrapper[2574]: I0424 22:29:50.640036 2574 flags.go:64] FLAG: --runonce="false" Apr 24 22:29:50.644816 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.640039 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 22:29:50.644816 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.640042 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 22:29:50.644816 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.640045 2574 flags.go:64] FLAG: --seccomp-default="false" Apr 24 22:29:50.644816 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.640048 2574 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 22:29:50.644816 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.640051 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 22:29:50.644816 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.640054 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 22:29:50.644816 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.640057 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 22:29:50.644816 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.640060 2574 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 22:29:50.644816 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.640063 2574 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 22:29:50.645456 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.640066 2574 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 22:29:50.645456 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.640070 2574 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 22:29:50.645456 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.640074 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 22:29:50.645456 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.640077 2574 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 22:29:50.645456 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.640080 2574 
flags.go:64] FLAG: --system-cgroups="" Apr 24 22:29:50.645456 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.640082 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 22:29:50.645456 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.640090 2574 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 22:29:50.645456 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.640093 2574 flags.go:64] FLAG: --tls-cert-file="" Apr 24 22:29:50.645456 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.640096 2574 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 22:29:50.645456 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.640100 2574 flags.go:64] FLAG: --tls-min-version="" Apr 24 22:29:50.645456 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.640103 2574 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 22:29:50.645456 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.640106 2574 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 22:29:50.645456 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.640109 2574 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 22:29:50.645456 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.640112 2574 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 22:29:50.645456 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.640115 2574 flags.go:64] FLAG: --v="2" Apr 24 22:29:50.645456 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.640119 2574 flags.go:64] FLAG: --version="false" Apr 24 22:29:50.645456 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.640124 2574 flags.go:64] FLAG: --vmodule="" Apr 24 22:29:50.645456 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.640128 2574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 22:29:50.645456 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.640131 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 22:29:50.645456 ip-10-0-133-161 
kubenswrapper[2574]: W0424 22:29:50.640223 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 22:29:50.645456 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640226 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 22:29:50.645456 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640229 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 22:29:50.645456 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640232 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 22:29:50.645456 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640234 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 22:29:50.646050 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640237 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 22:29:50.646050 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640240 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 22:29:50.646050 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640243 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 22:29:50.646050 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640245 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 22:29:50.646050 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640248 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 22:29:50.646050 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640251 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 22:29:50.646050 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640254 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 22:29:50.646050 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640256 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 
24 22:29:50.646050 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640259 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 22:29:50.646050 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640262 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 22:29:50.646050 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640265 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 22:29:50.646050 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640267 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 22:29:50.646050 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640270 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 22:29:50.646050 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640272 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 22:29:50.646050 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640276 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 22:29:50.646050 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640279 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 22:29:50.646050 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640281 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 22:29:50.646050 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640284 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 22:29:50.646050 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640286 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 22:29:50.646050 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640289 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 22:29:50.646562 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640292 2574 
feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 22:29:50.646562 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640294 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 22:29:50.646562 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640297 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 22:29:50.646562 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640299 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 22:29:50.646562 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640302 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 22:29:50.646562 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640305 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 22:29:50.646562 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640307 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 22:29:50.646562 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640310 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 22:29:50.646562 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640313 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 22:29:50.646562 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640315 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 22:29:50.646562 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640318 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 22:29:50.646562 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640320 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 22:29:50.646562 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640323 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 
24 22:29:50.646562 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640325 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 22:29:50.646562 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640328 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 22:29:50.646562 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640330 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 22:29:50.646562 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640333 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 22:29:50.646562 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640337 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 22:29:50.646562 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640341 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 22:29:50.647046 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640343 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 22:29:50.647046 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640346 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 22:29:50.647046 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640350 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 22:29:50.647046 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640352 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 22:29:50.647046 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640355 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 22:29:50.647046 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640357 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 22:29:50.647046 ip-10-0-133-161 kubenswrapper[2574]: W0424 
22:29:50.640360 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 22:29:50.647046 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640364 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 22:29:50.647046 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640367 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 22:29:50.647046 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640371 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 22:29:50.647046 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640374 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 22:29:50.647046 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640377 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 22:29:50.647046 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640381 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 22:29:50.647046 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640384 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 22:29:50.647046 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640387 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 22:29:50.647046 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640391 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 22:29:50.647046 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640393 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 22:29:50.647046 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640396 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 22:29:50.647046 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640399 2574 
feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 22:29:50.647532 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640401 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 24 22:29:50.647532 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640404 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 22:29:50.647532 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640406 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 22:29:50.647532 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640409 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 22:29:50.647532 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640412 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 22:29:50.647532 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640414 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 22:29:50.647532 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640416 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 22:29:50.647532 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640419 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 22:29:50.647532 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640421 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 22:29:50.647532 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640424 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 22:29:50.647532 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640426 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 22:29:50.647532 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640429 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 
22:29:50.647532 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640431 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 22:29:50.647532 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640434 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 22:29:50.647532 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640437 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 22:29:50.647532 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640440 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 22:29:50.647532 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640442 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 22:29:50.647532 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640445 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 22:29:50.647532 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640447 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 22:29:50.647532 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640450 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 22:29:50.648050 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640453 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 22:29:50.648050 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640455 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 22:29:50.648050 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.640458 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 22:29:50.648050 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.641498 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false 
ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 22:29:50.648050 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.647594 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 24 22:29:50.648050 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.647700 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 24 22:29:50.648050 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647748 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 22:29:50.648050 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647754 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 22:29:50.648050 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647758 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 22:29:50.648050 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647761 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 22:29:50.648050 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647764 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 22:29:50.648050 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647767 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 22:29:50.648050 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647769 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 22:29:50.648050 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647772 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 22:29:50.648050 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647775 2574 feature_gate.go:328] unrecognized feature gate: 
IngressControllerDynamicConfigurationManager Apr 24 22:29:50.648050 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647778 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 22:29:50.648445 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647782 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 22:29:50.648445 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647784 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 22:29:50.648445 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647787 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 22:29:50.648445 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647789 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 22:29:50.648445 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647792 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 22:29:50.648445 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647795 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 22:29:50.648445 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647797 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 22:29:50.648445 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647800 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 22:29:50.648445 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647803 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 22:29:50.648445 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647805 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 22:29:50.648445 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647808 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 22:29:50.648445 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647811 
2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 22:29:50.648445 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647813 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 22:29:50.648445 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647816 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 22:29:50.648445 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647818 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 22:29:50.648445 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647822 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 22:29:50.648445 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647824 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 22:29:50.648445 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647827 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 22:29:50.648445 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647829 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 22:29:50.648445 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647832 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 22:29:50.648919 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647834 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 22:29:50.648919 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647837 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 22:29:50.648919 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647840 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 22:29:50.648919 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647843 2574 feature_gate.go:328] unrecognized feature 
gate: ClusterAPIInstallIBMCloud Apr 24 22:29:50.648919 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647846 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 22:29:50.648919 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647849 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 22:29:50.648919 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647851 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 24 22:29:50.648919 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647854 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 22:29:50.648919 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647856 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 22:29:50.648919 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647859 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 22:29:50.648919 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647862 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 22:29:50.648919 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647864 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 22:29:50.648919 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647867 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 22:29:50.648919 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647869 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 22:29:50.648919 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647872 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 22:29:50.648919 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647874 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 22:29:50.648919 ip-10-0-133-161 kubenswrapper[2574]: W0424 
22:29:50.647877 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 22:29:50.648919 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647880 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 22:29:50.648919 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647882 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 22:29:50.648919 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647885 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 22:29:50.649487 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647887 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 22:29:50.649487 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647890 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 22:29:50.649487 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647892 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 22:29:50.649487 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647895 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 22:29:50.649487 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647897 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 22:29:50.649487 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647899 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 22:29:50.649487 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647902 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 22:29:50.649487 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647904 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 22:29:50.649487 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647907 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud 
Apr 24 22:29:50.649487 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647909 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 22:29:50.649487 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647912 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 22:29:50.649487 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647915 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 22:29:50.649487 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647917 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 22:29:50.649487 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647921 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 22:29:50.649487 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647925 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 22:29:50.649487 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647929 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 22:29:50.649487 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647932 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 22:29:50.649487 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647935 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 22:29:50.649487 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647938 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 22:29:50.650032 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647940 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 22:29:50.650032 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647943 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 22:29:50.650032 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647945 2574 
feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 22:29:50.650032 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647948 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 22:29:50.650032 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647950 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 22:29:50.650032 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647953 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 22:29:50.650032 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647971 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 22:29:50.650032 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647976 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 22:29:50.650032 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647979 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 22:29:50.650032 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647982 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 22:29:50.650032 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647985 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 22:29:50.650032 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647988 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 22:29:50.650032 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647991 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 22:29:50.650032 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647993 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 22:29:50.650032 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647996 2574 feature_gate.go:328] unrecognized feature gate: 
PreconfiguredUDNAddresses Apr 24 22:29:50.650032 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.647999 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 22:29:50.650032 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648002 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 22:29:50.650451 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.648007 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 22:29:50.650451 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648103 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 22:29:50.650451 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648107 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 22:29:50.650451 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648110 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 22:29:50.650451 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648113 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 22:29:50.650451 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648116 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 22:29:50.650451 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648119 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 22:29:50.650451 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648122 2574 
feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 22:29:50.650451 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648125 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 22:29:50.650451 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648128 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 22:29:50.650451 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648132 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 22:29:50.650451 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648134 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 22:29:50.650451 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648137 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 22:29:50.650451 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648139 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 22:29:50.650451 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648142 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 22:29:50.650821 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648145 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 22:29:50.650821 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648148 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 22:29:50.650821 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648150 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 22:29:50.650821 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648153 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 22:29:50.650821 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648155 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 22:29:50.650821 ip-10-0-133-161 
kubenswrapper[2574]: W0424 22:29:50.648158 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 22:29:50.650821 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648160 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 22:29:50.650821 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648163 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 22:29:50.650821 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648165 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 22:29:50.650821 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648168 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 22:29:50.650821 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648170 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 22:29:50.650821 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648173 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 22:29:50.650821 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648175 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 22:29:50.650821 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648178 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 22:29:50.650821 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648182 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 22:29:50.650821 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648185 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 22:29:50.650821 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648188 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 22:29:50.650821 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648191 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 22:29:50.650821 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648194 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 22:29:50.651304 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648196 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 22:29:50.651304 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648199 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 22:29:50.651304 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648202 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 22:29:50.651304 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648204 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 22:29:50.651304 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648206 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 22:29:50.651304 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648209 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 22:29:50.651304 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648211 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 22:29:50.651304 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648214 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 22:29:50.651304 ip-10-0-133-161 
kubenswrapper[2574]: W0424 22:29:50.648217 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 22:29:50.651304 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648220 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 22:29:50.651304 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648222 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 22:29:50.651304 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648224 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 22:29:50.651304 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648227 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 22:29:50.651304 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648230 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 22:29:50.651304 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648232 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 22:29:50.651304 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648235 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 22:29:50.651304 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648237 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 22:29:50.651304 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648240 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 22:29:50.651304 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648243 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 22:29:50.651753 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648245 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 22:29:50.651753 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648248 2574 feature_gate.go:328] unrecognized feature gate: 
VSphereHostVMGroupZonal Apr 24 22:29:50.651753 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648250 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 24 22:29:50.651753 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648253 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 22:29:50.651753 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648255 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 22:29:50.651753 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648258 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 22:29:50.651753 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648260 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 22:29:50.651753 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648263 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 22:29:50.651753 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648265 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 22:29:50.651753 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648268 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 22:29:50.651753 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648271 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 22:29:50.651753 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648273 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 22:29:50.651753 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648276 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 22:29:50.651753 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648278 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 22:29:50.651753 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648281 2574 feature_gate.go:328] unrecognized 
feature gate: NetworkSegmentation Apr 24 22:29:50.651753 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648284 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 22:29:50.651753 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648288 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 22:29:50.651753 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648291 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 22:29:50.651753 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648294 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 22:29:50.651753 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648296 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 22:29:50.652249 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648299 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 22:29:50.652249 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648302 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 22:29:50.652249 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648304 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 22:29:50.652249 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648307 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 22:29:50.652249 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648310 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 22:29:50.652249 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648312 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 22:29:50.652249 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648315 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall 
Apr 24 22:29:50.652249 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648318 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 22:29:50.652249 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648320 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 22:29:50.652249 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648323 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 22:29:50.652249 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648325 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 22:29:50.652249 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648327 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 22:29:50.652249 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648330 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 22:29:50.652249 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:50.648332 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 22:29:50.652249 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.648337 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 22:29:50.652249 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.648982 2574 server.go:962] "Client rotation is on, will bootstrap in background" Apr 24 22:29:50.652642 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.652407 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to 
point to the certificate dir" Apr 24 22:29:50.653500 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.653488 2574 server.go:1019] "Starting client certificate rotation" Apr 24 22:29:50.653601 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.653585 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 22:29:50.653639 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.653630 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 22:29:50.680910 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.680892 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 22:29:50.685397 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.685369 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 22:29:50.702773 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.702749 2574 log.go:25] "Validated CRI v1 runtime API" Apr 24 22:29:50.708471 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.708457 2574 log.go:25] "Validated CRI v1 image API" Apr 24 22:29:50.709516 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.709500 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 24 22:29:50.712654 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.712637 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 22:29:50.713534 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.713512 2574 fs.go:135] Filesystem UUIDs: map[728b96f2-050a-4c86-91ca-a1758d3a4eef:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 d2e96a9c-2e05-430c-9b70-5f4f6db5535b:/dev/nvme0n1p3] Apr 24 22:29:50.713585 ip-10-0-133-161 
kubenswrapper[2574]: I0424 22:29:50.713535 2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 24 22:29:50.719227 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.719117 2574 manager.go:217] Machine: {Timestamp:2026-04-24 22:29:50.717215208 +0000 UTC m=+0.440807780 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099593 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec28deb6e09b03700852926ca9c8e6bc SystemUUID:ec28deb6-e09b-0370-0852-926ca9c8e6bc BootID:2dfa5cdf-d47b-47f5-ac10-4fc6b1bc9a2b Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:cc:da:df:0b:11 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:cc:da:df:0b:11 Speed:0 
Mtu:9001} {Name:ovs-system MacAddress:ee:94:5e:c1:36:11 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 24 22:29:50.719227 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.719225 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 24 22:29:50.719353 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.719341 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 24 22:29:50.720907 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.720883 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 24 22:29:50.721063 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.720909 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-161.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 22:29:50.721107 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.721072 2574 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 22:29:50.721107 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.721080 2574 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 22:29:50.721107 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.721093 2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 22:29:50.721872 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.721863 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 22:29:50.722596 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.722587 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 24 22:29:50.722693 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.722685 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 22:29:50.724828 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.724817 2574 kubelet.go:491] "Attempting to sync node with API server" Apr 24 22:29:50.724867 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.724838 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 22:29:50.724867 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.724849 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 22:29:50.724867 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.724859 2574 kubelet.go:397] "Adding apiserver pod source" Apr 24 22:29:50.724867 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.724867 2574 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 24 22:29:50.726319 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.726304 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 22:29:50.726319 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.726321 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 22:29:50.730225 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.730208 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 22:29:50.731469 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.731455 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 22:29:50.733287 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.733276 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 22:29:50.733326 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.733296 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 22:29:50.733326 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.733307 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 22:29:50.733326 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.733315 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 22:29:50.733326 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.733322 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 22:29:50.733483 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.733329 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 22:29:50.733483 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.733335 2574 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 24 22:29:50.733483 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.733340 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 22:29:50.733483 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.733347 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 22:29:50.733483 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.733353 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 22:29:50.733483 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.733362 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 22:29:50.733483 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.733370 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 22:29:50.734151 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.734133 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 22:29:50.734151 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.734151 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 22:29:50.738099 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.738071 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-7h5mj" Apr 24 22:29:50.738358 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.738344 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 22:29:50.738429 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.738397 2574 server.go:1295] "Started kubelet" Apr 24 22:29:50.738565 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.738498 2574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 22:29:50.738614 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.738589 2574 server_v1.go:47] "podresources" method="list" useActivePods=true 
Apr 24 22:29:50.738800 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:50.738610 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-161.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 22:29:50.738908 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.738883 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 22:29:50.739460 ip-10-0-133-161 systemd[1]: Started Kubernetes Kubelet. Apr 24 22:29:50.742892 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:50.742860 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 22:29:50.742995 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.742924 2574 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-133-161.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 22:29:50.743050 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.743004 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 22:29:50.743174 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.743161 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-7h5mj" Apr 24 22:29:50.744569 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.744541 2574 server.go:317] "Adding debug handlers to kubelet server" Apr 24 22:29:50.746403 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:50.745484 2574 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-161.ec2.internal.18a96b8f3209a5db default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-161.ec2.internal,UID:ip-10-0-133-161.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-133-161.ec2.internal,},FirstTimestamp:2026-04-24 22:29:50.738359771 +0000 UTC m=+0.461952336,LastTimestamp:2026-04-24 22:29:50.738359771 +0000 UTC m=+0.461952336,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-161.ec2.internal,}" Apr 24 22:29:50.748096 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.748080 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 22:29:50.748183 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.748093 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 22:29:50.748953 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.748728 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 22:29:50.748953 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.748733 2574 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 22:29:50.748953 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.748757 2574 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 22:29:50.748953 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.748845 2574 reconstruct.go:97] "Volume reconstruction finished" Apr 24 22:29:50.748953 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.748856 2574 reconciler.go:26] "Reconciler: start to sync state" Apr 24 
22:29:50.748953 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:50.748916 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-161.ec2.internal\" not found" Apr 24 22:29:50.749287 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.749088 2574 factory.go:55] Registering systemd factory Apr 24 22:29:50.749287 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.749110 2574 factory.go:223] Registration of the systemd container factory successfully Apr 24 22:29:50.749502 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.749471 2574 factory.go:153] Registering CRI-O factory Apr 24 22:29:50.749502 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.749494 2574 factory.go:223] Registration of the crio container factory successfully Apr 24 22:29:50.749644 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.749542 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 22:29:50.749644 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.749561 2574 factory.go:103] Registering Raw factory Apr 24 22:29:50.749644 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.749576 2574 manager.go:1196] Started watching for new ooms in manager Apr 24 22:29:50.750020 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.750004 2574 manager.go:319] Starting recovery of all containers Apr 24 22:29:50.751089 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.751066 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 22:29:50.752817 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:50.752796 2574 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 22:29:50.753331 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:50.753312 2574 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-133-161.ec2.internal\" not found" node="ip-10-0-133-161.ec2.internal" Apr 24 22:29:50.759436 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.759411 2574 manager.go:324] Recovery completed Apr 24 22:29:50.764307 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.764293 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 22:29:50.768707 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.768688 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-161.ec2.internal" event="NodeHasSufficientMemory" Apr 24 22:29:50.768773 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.768718 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-161.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 22:29:50.768773 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.768728 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-161.ec2.internal" event="NodeHasSufficientPID" Apr 24 22:29:50.769199 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.769185 2574 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 22:29:50.769199 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.769196 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 22:29:50.769285 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.769213 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 24 22:29:50.772139 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.772125 2574 policy_none.go:49] "None policy: Start" Apr 24 22:29:50.772219 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.772144 2574 
memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 22:29:50.772219 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.772157 2574 state_mem.go:35] "Initializing new in-memory state store" Apr 24 22:29:50.823472 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.807512 2574 manager.go:341] "Starting Device Plugin manager" Apr 24 22:29:50.823472 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:50.807546 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 22:29:50.823472 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.807558 2574 server.go:85] "Starting device plugin registration server" Apr 24 22:29:50.823472 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.807764 2574 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 22:29:50.823472 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.807773 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 22:29:50.823472 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.807868 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 22:29:50.823472 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.807944 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 22:29:50.823472 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.807955 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 22:29:50.823472 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:50.808643 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 24 22:29:50.823472 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:50.808675 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-161.ec2.internal\" not found" Apr 24 22:29:50.840389 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.840361 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 22:29:50.841590 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.841575 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 22:29:50.841675 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.841602 2574 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 22:29:50.841675 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.841622 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 24 22:29:50.841675 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.841632 2574 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 22:29:50.841811 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:50.841713 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 22:29:50.845799 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.845781 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 22:29:50.908082 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.908002 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 22:29:50.909668 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.909652 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-161.ec2.internal" event="NodeHasSufficientMemory" Apr 24 22:29:50.909743 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.909682 2574 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-161.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 22:29:50.909743 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.909692 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-161.ec2.internal" event="NodeHasSufficientPID" Apr 24 22:29:50.909743 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.909717 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-161.ec2.internal" Apr 24 22:29:50.918214 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.918195 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-161.ec2.internal" Apr 24 22:29:50.918267 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:50.918223 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-161.ec2.internal\": node \"ip-10-0-133-161.ec2.internal\" not found" Apr 24 22:29:50.931155 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:50.931137 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-161.ec2.internal\" not found" Apr 24 22:29:50.942148 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.942129 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-161.ec2.internal"] Apr 24 22:29:50.942196 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.942187 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 22:29:50.943872 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.943859 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-161.ec2.internal" event="NodeHasSufficientMemory" Apr 24 22:29:50.943939 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.943884 2574 kubelet_node_status.go:736] "Recording 
event message for node" node="ip-10-0-133-161.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 22:29:50.943939 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.943894 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-161.ec2.internal" event="NodeHasSufficientPID" Apr 24 22:29:50.946202 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.946190 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 22:29:50.946861 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.946848 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal" Apr 24 22:29:50.946900 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.946880 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 22:29:50.949117 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.949098 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-161.ec2.internal" event="NodeHasSufficientMemory" Apr 24 22:29:50.949211 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.949143 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-161.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 22:29:50.949211 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.949104 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-161.ec2.internal" event="NodeHasSufficientMemory" Apr 24 22:29:50.949211 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.949156 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-161.ec2.internal" event="NodeHasSufficientPID" Apr 24 22:29:50.949211 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.949191 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-161.ec2.internal" 
event="NodeHasNoDiskPressure" Apr 24 22:29:50.949211 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.949202 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-161.ec2.internal" event="NodeHasSufficientPID" Apr 24 22:29:50.949929 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.949914 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/17556b0047f1a6a15f4b7d5854560826-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal\" (UID: \"17556b0047f1a6a15f4b7d5854560826\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal" Apr 24 22:29:50.949987 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.949938 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/17556b0047f1a6a15f4b7d5854560826-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal\" (UID: \"17556b0047f1a6a15f4b7d5854560826\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal" Apr 24 22:29:50.949987 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.949971 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/71c8bd7a095056144ad8091ca3c68103-config\") pod \"kube-apiserver-proxy-ip-10-0-133-161.ec2.internal\" (UID: \"71c8bd7a095056144ad8091ca3c68103\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-161.ec2.internal" Apr 24 22:29:50.951343 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.951328 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-161.ec2.internal" Apr 24 22:29:50.951413 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.951351 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 22:29:50.952085 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.952064 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-161.ec2.internal" event="NodeHasSufficientMemory" Apr 24 22:29:50.952182 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.952094 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-161.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 22:29:50.952182 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:50.952104 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-161.ec2.internal" event="NodeHasSufficientPID" Apr 24 22:29:50.970558 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:50.970538 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-161.ec2.internal\" not found" node="ip-10-0-133-161.ec2.internal" Apr 24 22:29:50.973975 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:50.973947 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-161.ec2.internal\" not found" node="ip-10-0-133-161.ec2.internal" Apr 24 22:29:51.031385 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:51.031341 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-161.ec2.internal\" not found" Apr 24 22:29:51.050096 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:51.050067 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/17556b0047f1a6a15f4b7d5854560826-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal\" (UID: \"17556b0047f1a6a15f4b7d5854560826\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal" Apr 24 22:29:51.050096 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:51.050099 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/17556b0047f1a6a15f4b7d5854560826-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal\" (UID: \"17556b0047f1a6a15f4b7d5854560826\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal" Apr 24 22:29:51.050257 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:51.050120 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/71c8bd7a095056144ad8091ca3c68103-config\") pod \"kube-apiserver-proxy-ip-10-0-133-161.ec2.internal\" (UID: \"71c8bd7a095056144ad8091ca3c68103\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-161.ec2.internal" Apr 24 22:29:51.050257 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:51.050155 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/17556b0047f1a6a15f4b7d5854560826-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal\" (UID: \"17556b0047f1a6a15f4b7d5854560826\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal" Apr 24 22:29:51.050257 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:51.050194 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/71c8bd7a095056144ad8091ca3c68103-config\") pod \"kube-apiserver-proxy-ip-10-0-133-161.ec2.internal\" (UID: \"71c8bd7a095056144ad8091ca3c68103\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-161.ec2.internal" Apr 24 22:29:51.050257 ip-10-0-133-161 
kubenswrapper[2574]: I0424 22:29:51.050215 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/17556b0047f1a6a15f4b7d5854560826-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal\" (UID: \"17556b0047f1a6a15f4b7d5854560826\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal"
Apr 24 22:29:51.131600 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:51.131558 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-161.ec2.internal\" not found"
Apr 24 22:29:51.232230 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:51.232168 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-161.ec2.internal\" not found"
Apr 24 22:29:51.272673 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:51.272626 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal"
Apr 24 22:29:51.276231 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:51.276210 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-161.ec2.internal"
Apr 24 22:29:51.332700 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:51.332662 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-161.ec2.internal\" not found"
Apr 24 22:29:51.433179 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:51.433140 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-161.ec2.internal\" not found"
Apr 24 22:29:51.533706 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:51.533632 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-161.ec2.internal\" not found"
Apr 24 22:29:51.634113 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:51.634079 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-161.ec2.internal\" not found"
Apr 24 22:29:51.653413 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:51.653391 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 22:29:51.653562 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:51.653545 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 22:29:51.653612 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:51.653558 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 22:29:51.735050 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:51.735018 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-161.ec2.internal\" not found"
Apr 24 22:29:51.745146 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:51.745106 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 22:24:50 +0000 UTC" deadline="2027-12-20 19:53:07.603195232 +0000 UTC"
Apr 24 22:29:51.745262 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:51.745150 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14517h23m15.858056632s"
Apr 24 22:29:51.748211 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:51.748194 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 22:29:51.765999 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:51.765980 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 22:29:51.783940 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:51.783895 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-5jmfv"
Apr 24 22:29:51.791681 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:51.791665 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-5jmfv"
Apr 24 22:29:51.811885 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:51.811644 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71c8bd7a095056144ad8091ca3c68103.slice/crio-851a95d384a31202451632172c814c7274349e92324fc5f65081b1c6773c939a WatchSource:0}: Error finding container 851a95d384a31202451632172c814c7274349e92324fc5f65081b1c6773c939a: Status 404 returned error can't find the container with id 851a95d384a31202451632172c814c7274349e92324fc5f65081b1c6773c939a
Apr 24 22:29:51.812163 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:51.812142 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17556b0047f1a6a15f4b7d5854560826.slice/crio-d5881207aea7c9f7226c0ce8d7761189e80af16bbd4edf5e693ef83506d746ef WatchSource:0}: Error finding container d5881207aea7c9f7226c0ce8d7761189e80af16bbd4edf5e693ef83506d746ef: Status 404 returned error can't find the container with id d5881207aea7c9f7226c0ce8d7761189e80af16bbd4edf5e693ef83506d746ef
Apr 24 22:29:51.816506 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:51.816491 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 22:29:51.835598 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:51.835579 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-161.ec2.internal\" not found"
Apr 24 22:29:51.844354 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:51.844313 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-161.ec2.internal" event={"ID":"71c8bd7a095056144ad8091ca3c68103","Type":"ContainerStarted","Data":"851a95d384a31202451632172c814c7274349e92324fc5f65081b1c6773c939a"}
Apr 24 22:29:51.845176 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:51.845151 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal" event={"ID":"17556b0047f1a6a15f4b7d5854560826","Type":"ContainerStarted","Data":"d5881207aea7c9f7226c0ce8d7761189e80af16bbd4edf5e693ef83506d746ef"}
Apr 24 22:29:51.936690 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:51.936644 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-161.ec2.internal\" not found"
Apr 24 22:29:51.984771 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:51.984753 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 22:29:52.048618 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.048562 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal"
Apr 24 22:29:52.064465 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.064444 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 22:29:52.066335 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.066323 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-161.ec2.internal"
Apr 24 22:29:52.078887 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.078868 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 22:29:52.087332 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.087315 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 22:29:52.628322 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.628295 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 22:29:52.673461 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.673434 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 22:29:52.726054 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.726024 2574 apiserver.go:52] "Watching apiserver"
Apr 24 22:29:52.733559 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.733531 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 24 22:29:52.735934 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.735903 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9jd7m","openshift-cluster-node-tuning-operator/tuned-klbmp","openshift-dns/node-resolver-knznw","openshift-image-registry/node-ca-xx2md","openshift-multus/multus-additional-cni-plugins-7rncr","openshift-multus/multus-ppjh8","openshift-ovn-kubernetes/ovnkube-node-b4zlm","kube-system/konnectivity-agent-l24ds","kube-system/kube-apiserver-proxy-ip-10-0-133-161.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal","openshift-multus/network-metrics-daemon-8ztg8","openshift-network-diagnostics/network-check-target-rscxl","openshift-network-operator/iptables-alerter-xj45v"]
Apr 24 22:29:52.740771 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.740748 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm"
Apr 24 22:29:52.742841 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.742818 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-klbmp"
Apr 24 22:29:52.744058 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.744038 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 24 22:29:52.744125 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.744094 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-h949l\""
Apr 24 22:29:52.745574 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.745557 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 24 22:29:52.745670 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.745577 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 24 22:29:52.745730 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.745722 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xx2md"
Apr 24 22:29:52.746024 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.745935 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 24 22:29:52.746113 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.746022 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 24 22:29:52.746256 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.746239 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 24 22:29:52.746350 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.746242 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 24 22:29:52.747043 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.747028 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 24 22:29:52.747131 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.747040 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-gng82\""
Apr 24 22:29:52.748194 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.747923 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ppjh8"
Apr 24 22:29:52.748194 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.748079 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9jd7m"
Apr 24 22:29:52.751196 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.751060 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-l24ds"
Apr 24 22:29:52.753130 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.753111 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 24 22:29:52.753412 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.753399 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 24 22:29:52.753585 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.753573 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 24 22:29:52.753665 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.753646 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 24 22:29:52.753843 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.753828 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-7xlk2\""
Apr 24 22:29:52.753910 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.753892 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 24 22:29:52.754185 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.754167 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 24 22:29:52.754245 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.754202 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-v5cl9\""
Apr 24 22:29:52.754723 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.754707 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-4847g\""
Apr 24 22:29:52.755136 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.755118 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 24 22:29:52.755198 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.755155 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 24 22:29:52.755905 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.755340 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 24 22:29:52.755905 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.755460 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 24 22:29:52.755905 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.755630 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 24 22:29:52.755905 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.755682 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-t2fll\""
Apr 24 22:29:52.756464 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.756439 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 24 22:29:52.758023 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.757661 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rscxl"
Apr 24 22:29:52.758023 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:52.757763 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rscxl" podUID="f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc"
Apr 24 22:29:52.759139 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.759120 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-ovnkube-config\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm"
Apr 24 22:29:52.759240 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.759153 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-host-run-ovn-kubernetes\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm"
Apr 24 22:29:52.759240 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.759182 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-systemd-units\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm"
Apr 24 22:29:52.759347 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.759237 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c4cc25c5-df90-4054-8f91-917b98a3eb96-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9jd7m\" (UID: \"c4cc25c5-df90-4054-8f91-917b98a3eb96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9jd7m"
Apr 24 22:29:52.759347 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.759269 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-host-slash\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm"
Apr 24 22:29:52.759347 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.759310 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-run-ovn\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm"
Apr 24 22:29:52.759497 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.759337 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd8pk\" (UniqueName: \"kubernetes.io/projected/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-kube-api-access-dd8pk\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm"
Apr 24 22:29:52.759497 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.759372 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-host-cni-bin\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm"
Apr 24 22:29:52.759497 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.759427 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-host-cni-netd\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm"
Apr 24 22:29:52.759497 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.759456 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-env-overrides\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm"
Apr 24 22:29:52.759497 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.759487 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-sys\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp"
Apr 24 22:29:52.759714 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.759525 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-cnibin\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8"
Apr 24 22:29:52.759714 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.759547 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-host-run-multus-certs\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8"
Apr 24 22:29:52.759714 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.759614 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c4cc25c5-df90-4054-8f91-917b98a3eb96-sys-fs\") pod \"aws-ebs-csi-driver-node-9jd7m\" (UID: \"c4cc25c5-df90-4054-8f91-917b98a3eb96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9jd7m"
Apr 24 22:29:52.759714 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.759639 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/23caaf68-bf31-4bd4-8417-c00b22900ee2-konnectivity-ca\") pod \"konnectivity-agent-l24ds\" (UID: \"23caaf68-bf31-4bd4-8417-c00b22900ee2\") " pod="kube-system/konnectivity-agent-l24ds"
Apr 24 22:29:52.759714 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.759659 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-etc-sysctl-d\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp"
Apr 24 22:29:52.759714 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.759680 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-host-run-netns\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8"
Apr 24 22:29:52.759714 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.759709 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c4cc25c5-df90-4054-8f91-917b98a3eb96-device-dir\") pod \"aws-ebs-csi-driver-node-9jd7m\" (UID: \"c4cc25c5-df90-4054-8f91-917b98a3eb96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9jd7m"
Apr 24 22:29:52.760042 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.759724 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw26z\" (UniqueName: \"kubernetes.io/projected/c4cc25c5-df90-4054-8f91-917b98a3eb96-kube-api-access-xw26z\") pod \"aws-ebs-csi-driver-node-9jd7m\" (UID: \"c4cc25c5-df90-4054-8f91-917b98a3eb96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9jd7m"
Apr 24 22:29:52.760042 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.759784 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/23caaf68-bf31-4bd4-8417-c00b22900ee2-agent-certs\") pod \"konnectivity-agent-l24ds\" (UID: \"23caaf68-bf31-4bd4-8417-c00b22900ee2\") " pod="kube-system/konnectivity-agent-l24ds"
Apr 24 22:29:52.760042 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.759807 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-var-lib-openvswitch\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm"
Apr 24 22:29:52.760042 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.759824 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-etc-openvswitch\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm"
Apr 24 22:29:52.760042 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.759846 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-ovnkube-script-lib\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm"
Apr 24 22:29:52.760042 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.759869 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-var-lib-kubelet\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp"
Apr 24 22:29:52.760042 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.759927 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c4cc25c5-df90-4054-8f91-917b98a3eb96-socket-dir\") pod \"aws-ebs-csi-driver-node-9jd7m\" (UID: \"c4cc25c5-df90-4054-8f91-917b98a3eb96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9jd7m"
Apr 24 22:29:52.760042 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.759990 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-host-kubelet\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm"
Apr 24 22:29:52.760042 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.760019 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-node-log\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm"
Apr 24 22:29:52.760390 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.760042 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-etc-systemd\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp"
Apr 24 22:29:52.760390 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.760100 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-host-var-lib-kubelet\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8"
Apr 24 22:29:52.760390 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.760136 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c4cc25c5-df90-4054-8f91-917b98a3eb96-etc-selinux\") pod \"aws-ebs-csi-driver-node-9jd7m\" (UID: \"c4cc25c5-df90-4054-8f91-917b98a3eb96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9jd7m"
Apr 24 22:29:52.760390 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.760162 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-log-socket\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm"
Apr 24 22:29:52.760390 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.760183 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/939c10b8-9c56-4502-a65a-30206c40fa9d-serviceca\") pod \"node-ca-xx2md\" (UID: \"939c10b8-9c56-4502-a65a-30206c40fa9d\") " pod="openshift-image-registry/node-ca-xx2md"
Apr 24 22:29:52.760390 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.760206 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-host-var-lib-cni-multus\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8"
Apr 24 22:29:52.760390 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.760242 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-etc-kubernetes\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8"
Apr 24 22:29:52.760390 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.760263 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-xj45v"
Apr 24 22:29:52.760390 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.760346 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-knznw"
Apr 24 22:29:52.760390 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.760263 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-run-systemd\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm"
Apr 24 22:29:52.760852 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.760409 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-lib-modules\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp"
Apr 24 22:29:52.760852 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.760437 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-tmp\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp"
Apr 24 22:29:52.760852 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.760460 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jsms\" (UniqueName: \"kubernetes.io/projected/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-kube-api-access-2jsms\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp"
Apr 24 22:29:52.760852 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.760483 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/eebbd623-0913-43a3-ad50-13184bd5baaa-cni-binary-copy\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8"
Apr 24 22:29:52.760852 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.760513 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-etc-modprobe-d\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp"
Apr 24 22:29:52.760852 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.760535 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-system-cni-dir\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8"
Apr 24 22:29:52.760852 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.760577 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-etc-sysctl-conf\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp"
Apr 24 22:29:52.760852 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.760610 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/939c10b8-9c56-4502-a65a-30206c40fa9d-host\") pod \"node-ca-xx2md\" (UID: \"939c10b8-9c56-4502-a65a-30206c40fa9d\") " pod="openshift-image-registry/node-ca-xx2md"
Apr 24 22:29:52.760852 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.760634 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-os-release\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8"
Apr 24 22:29:52.760852 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.760651 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-multus-socket-dir-parent\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8"
Apr 24 22:29:52.760852 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.760692 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-host-run-k8s-cni-cncf-io\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8"
Apr 24 22:29:52.760852 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.760725 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/eebbd623-0913-43a3-ad50-13184bd5baaa-multus-daemon-config\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8"
Apr 24 22:29:52.760852 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.760751 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-host\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp"
Apr 24 22:29:52.760852 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.760790 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-host-var-lib-cni-bin\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8"
Apr 24 22:29:52.760852 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.760817 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-etc-tuned\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp"
Apr 24 22:29:52.760852 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.760838 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-multus-conf-dir\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8"
Apr 24 22:29:52.760852 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.760860 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-etc-kubernetes\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp"
Apr 24 22:29:52.761644 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.760882 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-hostroot\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8"
Apr 24 22:29:52.761644 ip-10-0-133-161 kubenswrapper[2574]: I0424
22:29:52.760903 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c4cc25c5-df90-4054-8f91-917b98a3eb96-registration-dir\") pod \"aws-ebs-csi-driver-node-9jd7m\" (UID: \"c4cc25c5-df90-4054-8f91-917b98a3eb96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9jd7m" Apr 24 22:29:52.761644 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.760925 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-host-run-netns\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.761644 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.760947 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-ovn-node-metrics-cert\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.761644 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.760983 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-run\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp" Apr 24 22:29:52.761644 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.761023 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj5nj\" (UniqueName: \"kubernetes.io/projected/939c10b8-9c56-4502-a65a-30206c40fa9d-kube-api-access-xj5nj\") pod \"node-ca-xx2md\" (UID: 
\"939c10b8-9c56-4502-a65a-30206c40fa9d\") " pod="openshift-image-registry/node-ca-xx2md" Apr 24 22:29:52.761644 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.761064 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-multus-cni-dir\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8" Apr 24 22:29:52.761644 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.761088 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kvvn\" (UniqueName: \"kubernetes.io/projected/eebbd623-0913-43a3-ad50-13184bd5baaa-kube-api-access-5kvvn\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8" Apr 24 22:29:52.761644 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.761114 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.761644 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.761134 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-etc-sysconfig\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp" Apr 24 22:29:52.761644 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.761154 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-run-openvswitch\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.762641 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.762619 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 22:29:52.762748 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.762721 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7rncr" Apr 24 22:29:52.763143 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.763126 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 22:29:52.763217 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.763188 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-sj7g7\"" Apr 24 22:29:52.763217 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.763201 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 22:29:52.763325 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.763283 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 22:29:52.763894 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.763773 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-tsrpb\"" Apr 24 22:29:52.763894 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.763815 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 
22:29:52.765242 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.765219 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ztg8" Apr 24 22:29:52.765319 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:52.765279 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ztg8" podUID="d0c4cf71-fe26-4a65-bc22-b98bb5827d73" Apr 24 22:29:52.765864 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.765847 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 22:29:52.765955 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.765928 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 22:29:52.766175 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.766160 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-86tzl\"" Apr 24 22:29:52.793166 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.793141 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 22:24:51 +0000 UTC" deadline="2027-10-27 15:54:49.660067693 +0000 UTC" Apr 24 22:29:52.793266 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.793165 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13217h24m56.866905216s" Apr 24 22:29:52.849767 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.849742 2574 desired_state_of_world_populator.go:158] "Finished 
populating initial desired state of world" Apr 24 22:29:52.861658 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.861620 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-etc-tuned\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp" Apr 24 22:29:52.861809 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.861664 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-multus-conf-dir\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8" Apr 24 22:29:52.861809 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.861691 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-etc-kubernetes\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp" Apr 24 22:29:52.861809 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.861710 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-hostroot\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8" Apr 24 22:29:52.861809 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.861728 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74bjx\" (UniqueName: \"kubernetes.io/projected/319510cc-198d-4518-b16b-5a7c26091db0-kube-api-access-74bjx\") pod \"iptables-alerter-xj45v\" (UID: \"319510cc-198d-4518-b16b-5a7c26091db0\") " 
pod="openshift-network-operator/iptables-alerter-xj45v" Apr 24 22:29:52.861809 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.861731 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-multus-conf-dir\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8" Apr 24 22:29:52.861809 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.861752 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c4cc25c5-df90-4054-8f91-917b98a3eb96-registration-dir\") pod \"aws-ebs-csi-driver-node-9jd7m\" (UID: \"c4cc25c5-df90-4054-8f91-917b98a3eb96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9jd7m" Apr 24 22:29:52.861809 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.861777 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-host-run-netns\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.861809 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.861796 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-etc-kubernetes\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp" Apr 24 22:29:52.861809 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.861798 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-hostroot\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " 
pod="openshift-multus/multus-ppjh8" Apr 24 22:29:52.861809 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.861810 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c4cc25c5-df90-4054-8f91-917b98a3eb96-registration-dir\") pod \"aws-ebs-csi-driver-node-9jd7m\" (UID: \"c4cc25c5-df90-4054-8f91-917b98a3eb96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9jd7m" Apr 24 22:29:52.862292 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.861801 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-ovn-node-metrics-cert\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.862292 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.861844 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-host-run-netns\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.862292 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.861868 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-run\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp" Apr 24 22:29:52.862292 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.861890 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xj5nj\" (UniqueName: \"kubernetes.io/projected/939c10b8-9c56-4502-a65a-30206c40fa9d-kube-api-access-xj5nj\") pod \"node-ca-xx2md\" (UID: 
\"939c10b8-9c56-4502-a65a-30206c40fa9d\") " pod="openshift-image-registry/node-ca-xx2md" Apr 24 22:29:52.862292 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.861995 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 22:29:52.862292 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862055 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-multus-cni-dir\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8" Apr 24 22:29:52.862292 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862084 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5kvvn\" (UniqueName: \"kubernetes.io/projected/eebbd623-0913-43a3-ad50-13184bd5baaa-kube-api-access-5kvvn\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8" Apr 24 22:29:52.862292 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862115 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ca9127c4-a533-44bc-9593-d1308d3b463f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7rncr\" (UID: \"ca9127c4-a533-44bc-9593-d1308d3b463f\") " pod="openshift-multus/multus-additional-cni-plugins-7rncr" Apr 24 22:29:52.862292 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862147 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b4zlm\" (UID: 
\"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.862292 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862172 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-etc-sysconfig\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp" Apr 24 22:29:52.862292 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862196 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-run-openvswitch\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.862292 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862202 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-multus-cni-dir\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8" Apr 24 22:29:52.862292 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862222 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-ovnkube-config\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.862292 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862248 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ca9127c4-a533-44bc-9593-d1308d3b463f-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-7rncr\" (UID: \"ca9127c4-a533-44bc-9593-d1308d3b463f\") " pod="openshift-multus/multus-additional-cni-plugins-7rncr" Apr 24 22:29:52.862292 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862260 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-run\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp" Apr 24 22:29:52.862292 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862275 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-host-run-ovn-kubernetes\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.862292 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862300 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-systemd-units\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.863164 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862302 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.863164 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862429 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/c4cc25c5-df90-4054-8f91-917b98a3eb96-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9jd7m\" (UID: \"c4cc25c5-df90-4054-8f91-917b98a3eb96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9jd7m" Apr 24 22:29:52.863164 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862460 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-systemd-units\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.863164 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862431 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-run-openvswitch\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.863164 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862435 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-etc-sysconfig\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp" Apr 24 22:29:52.863164 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862480 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c4cc25c5-df90-4054-8f91-917b98a3eb96-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9jd7m\" (UID: \"c4cc25c5-df90-4054-8f91-917b98a3eb96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9jd7m" Apr 24 22:29:52.863164 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862462 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-host-slash\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.863164 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862502 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-host-run-ovn-kubernetes\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.863164 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862515 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-run-ovn\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.863164 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862508 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-host-slash\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.863164 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862540 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dd8pk\" (UniqueName: \"kubernetes.io/projected/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-kube-api-access-dd8pk\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.863164 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862559 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-run-ovn\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.863164 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862565 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-host-cni-bin\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.863164 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862586 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-host-cni-netd\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.863164 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862606 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-env-overrides\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.863164 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862625 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-sys\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp" Apr 24 22:29:52.863164 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862643 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-cnibin\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8" Apr 24 22:29:52.863987 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862663 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-host-run-multus-certs\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8" Apr 24 22:29:52.863987 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862680 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-host-cni-netd\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.863987 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862686 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjvd6\" (UniqueName: \"kubernetes.io/projected/f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc-kube-api-access-sjvd6\") pod \"network-check-target-rscxl\" (UID: \"f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc\") " pod="openshift-network-diagnostics/network-check-target-rscxl" Apr 24 22:29:52.863987 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862692 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-sys\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp" Apr 24 22:29:52.863987 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862709 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" 
(UniqueName: \"kubernetes.io/empty-dir/8acc4f4f-831f-4c10-a187-01230734276e-tmp-dir\") pod \"node-resolver-knznw\" (UID: \"8acc4f4f-831f-4c10-a187-01230734276e\") " pod="openshift-dns/node-resolver-knznw" Apr 24 22:29:52.863987 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862718 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-host-cni-bin\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.863987 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862732 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c4cc25c5-df90-4054-8f91-917b98a3eb96-sys-fs\") pod \"aws-ebs-csi-driver-node-9jd7m\" (UID: \"c4cc25c5-df90-4054-8f91-917b98a3eb96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9jd7m" Apr 24 22:29:52.863987 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862738 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-host-run-multus-certs\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8" Apr 24 22:29:52.863987 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862761 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/23caaf68-bf31-4bd4-8417-c00b22900ee2-konnectivity-ca\") pod \"konnectivity-agent-l24ds\" (UID: \"23caaf68-bf31-4bd4-8417-c00b22900ee2\") " pod="kube-system/konnectivity-agent-l24ds" Apr 24 22:29:52.863987 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862791 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-ovnkube-config\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.863987 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862802 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-cnibin\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8" Apr 24 22:29:52.863987 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862795 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-etc-sysctl-d\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp" Apr 24 22:29:52.863987 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862828 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-host-run-netns\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8" Apr 24 22:29:52.863987 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862796 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c4cc25c5-df90-4054-8f91-917b98a3eb96-sys-fs\") pod \"aws-ebs-csi-driver-node-9jd7m\" (UID: \"c4cc25c5-df90-4054-8f91-917b98a3eb96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9jd7m" Apr 24 22:29:52.863987 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862852 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/ca9127c4-a533-44bc-9593-d1308d3b463f-os-release\") pod \"multus-additional-cni-plugins-7rncr\" (UID: \"ca9127c4-a533-44bc-9593-d1308d3b463f\") " pod="openshift-multus/multus-additional-cni-plugins-7rncr" Apr 24 22:29:52.863987 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862872 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-host-run-netns\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8" Apr 24 22:29:52.863987 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862875 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ca9127c4-a533-44bc-9593-d1308d3b463f-cni-binary-copy\") pod \"multus-additional-cni-plugins-7rncr\" (UID: \"ca9127c4-a533-44bc-9593-d1308d3b463f\") " pod="openshift-multus/multus-additional-cni-plugins-7rncr" Apr 24 22:29:52.864919 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862912 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-etc-sysctl-d\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp" Apr 24 22:29:52.864919 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862913 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c4cc25c5-df90-4054-8f91-917b98a3eb96-device-dir\") pod \"aws-ebs-csi-driver-node-9jd7m\" (UID: \"c4cc25c5-df90-4054-8f91-917b98a3eb96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9jd7m" Apr 24 22:29:52.864919 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862949 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c4cc25c5-df90-4054-8f91-917b98a3eb96-device-dir\") pod \"aws-ebs-csi-driver-node-9jd7m\" (UID: \"c4cc25c5-df90-4054-8f91-917b98a3eb96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9jd7m" Apr 24 22:29:52.864919 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.862975 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/319510cc-198d-4518-b16b-5a7c26091db0-host-slash\") pod \"iptables-alerter-xj45v\" (UID: \"319510cc-198d-4518-b16b-5a7c26091db0\") " pod="openshift-network-operator/iptables-alerter-xj45v" Apr 24 22:29:52.864919 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863004 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npksk\" (UniqueName: \"kubernetes.io/projected/ca9127c4-a533-44bc-9593-d1308d3b463f-kube-api-access-npksk\") pod \"multus-additional-cni-plugins-7rncr\" (UID: \"ca9127c4-a533-44bc-9593-d1308d3b463f\") " pod="openshift-multus/multus-additional-cni-plugins-7rncr" Apr 24 22:29:52.864919 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863032 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvjc6\" (UniqueName: \"kubernetes.io/projected/8acc4f4f-831f-4c10-a187-01230734276e-kube-api-access-pvjc6\") pod \"node-resolver-knznw\" (UID: \"8acc4f4f-831f-4c10-a187-01230734276e\") " pod="openshift-dns/node-resolver-knznw" Apr 24 22:29:52.864919 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863061 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xw26z\" (UniqueName: \"kubernetes.io/projected/c4cc25c5-df90-4054-8f91-917b98a3eb96-kube-api-access-xw26z\") pod \"aws-ebs-csi-driver-node-9jd7m\" (UID: \"c4cc25c5-df90-4054-8f91-917b98a3eb96\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9jd7m" Apr 24 22:29:52.864919 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863086 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/23caaf68-bf31-4bd4-8417-c00b22900ee2-agent-certs\") pod \"konnectivity-agent-l24ds\" (UID: \"23caaf68-bf31-4bd4-8417-c00b22900ee2\") " pod="kube-system/konnectivity-agent-l24ds" Apr 24 22:29:52.864919 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863113 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-var-lib-openvswitch\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.864919 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863137 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-etc-openvswitch\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.864919 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863161 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-ovnkube-script-lib\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.864919 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863172 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-var-lib-openvswitch\") pod \"ovnkube-node-b4zlm\" (UID: 
\"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.864919 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863183 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-var-lib-kubelet\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp" Apr 24 22:29:52.864919 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863224 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-var-lib-kubelet\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp" Apr 24 22:29:52.864919 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863252 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c4cc25c5-df90-4054-8f91-917b98a3eb96-socket-dir\") pod \"aws-ebs-csi-driver-node-9jd7m\" (UID: \"c4cc25c5-df90-4054-8f91-917b98a3eb96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9jd7m" Apr 24 22:29:52.864919 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863255 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/23caaf68-bf31-4bd4-8417-c00b22900ee2-konnectivity-ca\") pod \"konnectivity-agent-l24ds\" (UID: \"23caaf68-bf31-4bd4-8417-c00b22900ee2\") " pod="kube-system/konnectivity-agent-l24ds" Apr 24 22:29:52.864919 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863263 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-etc-openvswitch\") pod 
\"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.865761 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863306 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-host-kubelet\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.865761 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863334 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-node-log\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.865761 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863378 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c4cc25c5-df90-4054-8f91-917b98a3eb96-socket-dir\") pod \"aws-ebs-csi-driver-node-9jd7m\" (UID: \"c4cc25c5-df90-4054-8f91-917b98a3eb96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9jd7m" Apr 24 22:29:52.865761 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863389 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-host-kubelet\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.865761 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863381 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-etc-systemd\") pod 
\"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp" Apr 24 22:29:52.865761 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863427 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-etc-systemd\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp" Apr 24 22:29:52.865761 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863432 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-host-var-lib-kubelet\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8" Apr 24 22:29:52.865761 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863463 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c4cc25c5-df90-4054-8f91-917b98a3eb96-etc-selinux\") pod \"aws-ebs-csi-driver-node-9jd7m\" (UID: \"c4cc25c5-df90-4054-8f91-917b98a3eb96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9jd7m" Apr 24 22:29:52.865761 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863470 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-host-var-lib-kubelet\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8" Apr 24 22:29:52.865761 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863489 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-log-socket\") pod 
\"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.865761 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863517 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/939c10b8-9c56-4502-a65a-30206c40fa9d-serviceca\") pod \"node-ca-xx2md\" (UID: \"939c10b8-9c56-4502-a65a-30206c40fa9d\") " pod="openshift-image-registry/node-ca-xx2md" Apr 24 22:29:52.865761 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863543 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-host-var-lib-cni-multus\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8" Apr 24 22:29:52.865761 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863568 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-etc-kubernetes\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8" Apr 24 22:29:52.865761 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863594 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-run-systemd\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.865761 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863616 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-lib-modules\") pod \"tuned-klbmp\" (UID: 
\"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp" Apr 24 22:29:52.865761 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863638 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-tmp\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp" Apr 24 22:29:52.865761 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863653 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-env-overrides\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.865761 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863660 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2jsms\" (UniqueName: \"kubernetes.io/projected/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-kube-api-access-2jsms\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp" Apr 24 22:29:52.866475 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863683 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/eebbd623-0913-43a3-ad50-13184bd5baaa-cni-binary-copy\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8" Apr 24 22:29:52.866475 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863709 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65mhg\" (UniqueName: \"kubernetes.io/projected/d0c4cf71-fe26-4a65-bc22-b98bb5827d73-kube-api-access-65mhg\") pod 
\"network-metrics-daemon-8ztg8\" (UID: \"d0c4cf71-fe26-4a65-bc22-b98bb5827d73\") " pod="openshift-multus/network-metrics-daemon-8ztg8" Apr 24 22:29:52.866475 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863714 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-host-var-lib-cni-multus\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8" Apr 24 22:29:52.866475 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863734 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-etc-modprobe-d\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp" Apr 24 22:29:52.866475 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863753 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-node-log\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.866475 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863759 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-system-cni-dir\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8" Apr 24 22:29:52.866475 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863784 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ca9127c4-a533-44bc-9593-d1308d3b463f-cnibin\") pod 
\"multus-additional-cni-plugins-7rncr\" (UID: \"ca9127c4-a533-44bc-9593-d1308d3b463f\") " pod="openshift-multus/multus-additional-cni-plugins-7rncr" Apr 24 22:29:52.866475 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863801 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c4cc25c5-df90-4054-8f91-917b98a3eb96-etc-selinux\") pod \"aws-ebs-csi-driver-node-9jd7m\" (UID: \"c4cc25c5-df90-4054-8f91-917b98a3eb96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9jd7m" Apr 24 22:29:52.866475 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863808 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ca9127c4-a533-44bc-9593-d1308d3b463f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7rncr\" (UID: \"ca9127c4-a533-44bc-9593-d1308d3b463f\") " pod="openshift-multus/multus-additional-cni-plugins-7rncr" Apr 24 22:29:52.866475 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863837 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-log-socket\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.866475 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863837 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-etc-sysctl-conf\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp" Apr 24 22:29:52.866475 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863834 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" 
(UniqueName: \"kubernetes.io/configmap/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-ovnkube-script-lib\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.866475 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863926 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-etc-sysctl-conf\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp" Apr 24 22:29:52.866475 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863943 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-etc-modprobe-d\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp" Apr 24 22:29:52.866475 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.863999 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/939c10b8-9c56-4502-a65a-30206c40fa9d-host\") pod \"node-ca-xx2md\" (UID: \"939c10b8-9c56-4502-a65a-30206c40fa9d\") " pod="openshift-image-registry/node-ca-xx2md" Apr 24 22:29:52.866475 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.864027 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-os-release\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8" Apr 24 22:29:52.866475 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.864000 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-system-cni-dir\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8" Apr 24 22:29:52.867114 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.864158 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-multus-socket-dir-parent\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8" Apr 24 22:29:52.867114 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.864187 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-host-run-k8s-cni-cncf-io\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8" Apr 24 22:29:52.867114 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.864213 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/eebbd623-0913-43a3-ad50-13184bd5baaa-multus-daemon-config\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8" Apr 24 22:29:52.867114 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.864241 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/319510cc-198d-4518-b16b-5a7c26091db0-iptables-alerter-script\") pod \"iptables-alerter-xj45v\" (UID: \"319510cc-198d-4518-b16b-5a7c26091db0\") " pod="openshift-network-operator/iptables-alerter-xj45v" Apr 24 22:29:52.867114 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.864268 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-host\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp" Apr 24 22:29:52.867114 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.864294 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-run-systemd\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.867114 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.864379 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-lib-modules\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp" Apr 24 22:29:52.867114 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.864411 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-etc-kubernetes\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8" Apr 24 22:29:52.867114 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.864445 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/939c10b8-9c56-4502-a65a-30206c40fa9d-host\") pod \"node-ca-xx2md\" (UID: \"939c10b8-9c56-4502-a65a-30206c40fa9d\") " pod="openshift-image-registry/node-ca-xx2md" Apr 24 22:29:52.867114 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.864464 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-os-release\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8" Apr 24 22:29:52.867114 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.864500 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-multus-socket-dir-parent\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8" Apr 24 22:29:52.867114 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.864505 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-host-var-lib-cni-bin\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8" Apr 24 22:29:52.867114 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.864526 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/eebbd623-0913-43a3-ad50-13184bd5baaa-cni-binary-copy\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8" Apr 24 22:29:52.867114 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.864544 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-host-run-k8s-cni-cncf-io\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8" Apr 24 22:29:52.867114 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.864510 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-host\") pod 
\"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp" Apr 24 22:29:52.867114 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.864552 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ca9127c4-a533-44bc-9593-d1308d3b463f-system-cni-dir\") pod \"multus-additional-cni-plugins-7rncr\" (UID: \"ca9127c4-a533-44bc-9593-d1308d3b463f\") " pod="openshift-multus/multus-additional-cni-plugins-7rncr" Apr 24 22:29:52.867114 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.864586 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eebbd623-0913-43a3-ad50-13184bd5baaa-host-var-lib-cni-bin\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8" Apr 24 22:29:52.867114 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.864626 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0c4cf71-fe26-4a65-bc22-b98bb5827d73-metrics-certs\") pod \"network-metrics-daemon-8ztg8\" (UID: \"d0c4cf71-fe26-4a65-bc22-b98bb5827d73\") " pod="openshift-multus/network-metrics-daemon-8ztg8" Apr 24 22:29:52.867951 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.864654 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8acc4f4f-831f-4c10-a187-01230734276e-hosts-file\") pod \"node-resolver-knznw\" (UID: \"8acc4f4f-831f-4c10-a187-01230734276e\") " pod="openshift-dns/node-resolver-knznw" Apr 24 22:29:52.867951 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.864745 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/939c10b8-9c56-4502-a65a-30206c40fa9d-serviceca\") pod \"node-ca-xx2md\" (UID: \"939c10b8-9c56-4502-a65a-30206c40fa9d\") " pod="openshift-image-registry/node-ca-xx2md" Apr 24 22:29:52.867951 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.865258 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/eebbd623-0913-43a3-ad50-13184bd5baaa-multus-daemon-config\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8" Apr 24 22:29:52.867951 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.865774 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-etc-tuned\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp" Apr 24 22:29:52.867951 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.865892 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-ovn-node-metrics-cert\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.867951 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.866457 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/23caaf68-bf31-4bd4-8417-c00b22900ee2-agent-certs\") pod \"konnectivity-agent-l24ds\" (UID: \"23caaf68-bf31-4bd4-8417-c00b22900ee2\") " pod="kube-system/konnectivity-agent-l24ds" Apr 24 22:29:52.867951 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.866657 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-tmp\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp" Apr 24 22:29:52.872461 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.872406 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kvvn\" (UniqueName: \"kubernetes.io/projected/eebbd623-0913-43a3-ad50-13184bd5baaa-kube-api-access-5kvvn\") pod \"multus-ppjh8\" (UID: \"eebbd623-0913-43a3-ad50-13184bd5baaa\") " pod="openshift-multus/multus-ppjh8" Apr 24 22:29:52.872845 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.872825 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd8pk\" (UniqueName: \"kubernetes.io/projected/ce4f31c6-c297-4d19-b0f3-f05c45c17a8e-kube-api-access-dd8pk\") pod \"ovnkube-node-b4zlm\" (UID: \"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:52.873645 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.873597 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj5nj\" (UniqueName: \"kubernetes.io/projected/939c10b8-9c56-4502-a65a-30206c40fa9d-kube-api-access-xj5nj\") pod \"node-ca-xx2md\" (UID: \"939c10b8-9c56-4502-a65a-30206c40fa9d\") " pod="openshift-image-registry/node-ca-xx2md" Apr 24 22:29:52.874673 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.874647 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw26z\" (UniqueName: \"kubernetes.io/projected/c4cc25c5-df90-4054-8f91-917b98a3eb96-kube-api-access-xw26z\") pod \"aws-ebs-csi-driver-node-9jd7m\" (UID: \"c4cc25c5-df90-4054-8f91-917b98a3eb96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9jd7m" Apr 24 22:29:52.875208 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.875187 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-2jsms\" (UniqueName: \"kubernetes.io/projected/ea8f7937-1cbc-424c-ba7e-220e1d538dbe-kube-api-access-2jsms\") pod \"tuned-klbmp\" (UID: \"ea8f7937-1cbc-424c-ba7e-220e1d538dbe\") " pod="openshift-cluster-node-tuning-operator/tuned-klbmp" Apr 24 22:29:52.965504 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.965439 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjvd6\" (UniqueName: \"kubernetes.io/projected/f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc-kube-api-access-sjvd6\") pod \"network-check-target-rscxl\" (UID: \"f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc\") " pod="openshift-network-diagnostics/network-check-target-rscxl" Apr 24 22:29:52.965504 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.965471 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8acc4f4f-831f-4c10-a187-01230734276e-tmp-dir\") pod \"node-resolver-knznw\" (UID: \"8acc4f4f-831f-4c10-a187-01230734276e\") " pod="openshift-dns/node-resolver-knznw" Apr 24 22:29:52.965504 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.965491 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ca9127c4-a533-44bc-9593-d1308d3b463f-os-release\") pod \"multus-additional-cni-plugins-7rncr\" (UID: \"ca9127c4-a533-44bc-9593-d1308d3b463f\") " pod="openshift-multus/multus-additional-cni-plugins-7rncr" Apr 24 22:29:52.965504 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.965507 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ca9127c4-a533-44bc-9593-d1308d3b463f-cni-binary-copy\") pod \"multus-additional-cni-plugins-7rncr\" (UID: \"ca9127c4-a533-44bc-9593-d1308d3b463f\") " pod="openshift-multus/multus-additional-cni-plugins-7rncr" Apr 24 22:29:52.965809 ip-10-0-133-161 kubenswrapper[2574]: I0424 
22:29:52.965531 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/319510cc-198d-4518-b16b-5a7c26091db0-host-slash\") pod \"iptables-alerter-xj45v\" (UID: \"319510cc-198d-4518-b16b-5a7c26091db0\") " pod="openshift-network-operator/iptables-alerter-xj45v" Apr 24 22:29:52.965809 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.965547 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-npksk\" (UniqueName: \"kubernetes.io/projected/ca9127c4-a533-44bc-9593-d1308d3b463f-kube-api-access-npksk\") pod \"multus-additional-cni-plugins-7rncr\" (UID: \"ca9127c4-a533-44bc-9593-d1308d3b463f\") " pod="openshift-multus/multus-additional-cni-plugins-7rncr" Apr 24 22:29:52.965809 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.965564 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pvjc6\" (UniqueName: \"kubernetes.io/projected/8acc4f4f-831f-4c10-a187-01230734276e-kube-api-access-pvjc6\") pod \"node-resolver-knznw\" (UID: \"8acc4f4f-831f-4c10-a187-01230734276e\") " pod="openshift-dns/node-resolver-knznw" Apr 24 22:29:52.965809 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.965588 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-65mhg\" (UniqueName: \"kubernetes.io/projected/d0c4cf71-fe26-4a65-bc22-b98bb5827d73-kube-api-access-65mhg\") pod \"network-metrics-daemon-8ztg8\" (UID: \"d0c4cf71-fe26-4a65-bc22-b98bb5827d73\") " pod="openshift-multus/network-metrics-daemon-8ztg8" Apr 24 22:29:52.965809 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.965611 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ca9127c4-a533-44bc-9593-d1308d3b463f-cnibin\") pod \"multus-additional-cni-plugins-7rncr\" (UID: \"ca9127c4-a533-44bc-9593-d1308d3b463f\") " 
pod="openshift-multus/multus-additional-cni-plugins-7rncr" Apr 24 22:29:52.965809 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.965610 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ca9127c4-a533-44bc-9593-d1308d3b463f-os-release\") pod \"multus-additional-cni-plugins-7rncr\" (UID: \"ca9127c4-a533-44bc-9593-d1308d3b463f\") " pod="openshift-multus/multus-additional-cni-plugins-7rncr" Apr 24 22:29:52.965809 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.965629 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ca9127c4-a533-44bc-9593-d1308d3b463f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7rncr\" (UID: \"ca9127c4-a533-44bc-9593-d1308d3b463f\") " pod="openshift-multus/multus-additional-cni-plugins-7rncr" Apr 24 22:29:52.965809 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.965675 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/319510cc-198d-4518-b16b-5a7c26091db0-iptables-alerter-script\") pod \"iptables-alerter-xj45v\" (UID: \"319510cc-198d-4518-b16b-5a7c26091db0\") " pod="openshift-network-operator/iptables-alerter-xj45v" Apr 24 22:29:52.965809 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.965707 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ca9127c4-a533-44bc-9593-d1308d3b463f-system-cni-dir\") pod \"multus-additional-cni-plugins-7rncr\" (UID: \"ca9127c4-a533-44bc-9593-d1308d3b463f\") " pod="openshift-multus/multus-additional-cni-plugins-7rncr" Apr 24 22:29:52.965809 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.965735 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/d0c4cf71-fe26-4a65-bc22-b98bb5827d73-metrics-certs\") pod \"network-metrics-daemon-8ztg8\" (UID: \"d0c4cf71-fe26-4a65-bc22-b98bb5827d73\") " pod="openshift-multus/network-metrics-daemon-8ztg8" Apr 24 22:29:52.965809 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.965778 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8acc4f4f-831f-4c10-a187-01230734276e-hosts-file\") pod \"node-resolver-knznw\" (UID: \"8acc4f4f-831f-4c10-a187-01230734276e\") " pod="openshift-dns/node-resolver-knznw" Apr 24 22:29:52.965809 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.965785 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8acc4f4f-831f-4c10-a187-01230734276e-tmp-dir\") pod \"node-resolver-knznw\" (UID: \"8acc4f4f-831f-4c10-a187-01230734276e\") " pod="openshift-dns/node-resolver-knznw" Apr 24 22:29:52.965809 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.965809 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-74bjx\" (UniqueName: \"kubernetes.io/projected/319510cc-198d-4518-b16b-5a7c26091db0-kube-api-access-74bjx\") pod \"iptables-alerter-xj45v\" (UID: \"319510cc-198d-4518-b16b-5a7c26091db0\") " pod="openshift-network-operator/iptables-alerter-xj45v" Apr 24 22:29:52.966400 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.965837 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ca9127c4-a533-44bc-9593-d1308d3b463f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7rncr\" (UID: \"ca9127c4-a533-44bc-9593-d1308d3b463f\") " pod="openshift-multus/multus-additional-cni-plugins-7rncr" Apr 24 22:29:52.966400 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.965844 2574 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ca9127c4-a533-44bc-9593-d1308d3b463f-system-cni-dir\") pod \"multus-additional-cni-plugins-7rncr\" (UID: \"ca9127c4-a533-44bc-9593-d1308d3b463f\") " pod="openshift-multus/multus-additional-cni-plugins-7rncr" Apr 24 22:29:52.966400 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.965858 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ca9127c4-a533-44bc-9593-d1308d3b463f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7rncr\" (UID: \"ca9127c4-a533-44bc-9593-d1308d3b463f\") " pod="openshift-multus/multus-additional-cni-plugins-7rncr" Apr 24 22:29:52.966400 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.965890 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/319510cc-198d-4518-b16b-5a7c26091db0-host-slash\") pod \"iptables-alerter-xj45v\" (UID: \"319510cc-198d-4518-b16b-5a7c26091db0\") " pod="openshift-network-operator/iptables-alerter-xj45v" Apr 24 22:29:52.966400 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:52.965942 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:52.966400 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:52.966030 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0c4cf71-fe26-4a65-bc22-b98bb5827d73-metrics-certs podName:d0c4cf71-fe26-4a65-bc22-b98bb5827d73 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:53.466008289 +0000 UTC m=+3.189600838 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0c4cf71-fe26-4a65-bc22-b98bb5827d73-metrics-certs") pod "network-metrics-daemon-8ztg8" (UID: "d0c4cf71-fe26-4a65-bc22-b98bb5827d73") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:52.966400 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.966087 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8acc4f4f-831f-4c10-a187-01230734276e-hosts-file\") pod \"node-resolver-knznw\" (UID: \"8acc4f4f-831f-4c10-a187-01230734276e\") " pod="openshift-dns/node-resolver-knznw" Apr 24 22:29:52.966400 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.966105 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ca9127c4-a533-44bc-9593-d1308d3b463f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7rncr\" (UID: \"ca9127c4-a533-44bc-9593-d1308d3b463f\") " pod="openshift-multus/multus-additional-cni-plugins-7rncr" Apr 24 22:29:52.966400 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.966165 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ca9127c4-a533-44bc-9593-d1308d3b463f-cnibin\") pod \"multus-additional-cni-plugins-7rncr\" (UID: \"ca9127c4-a533-44bc-9593-d1308d3b463f\") " pod="openshift-multus/multus-additional-cni-plugins-7rncr" Apr 24 22:29:52.966400 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.966192 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ca9127c4-a533-44bc-9593-d1308d3b463f-cni-binary-copy\") pod \"multus-additional-cni-plugins-7rncr\" (UID: \"ca9127c4-a533-44bc-9593-d1308d3b463f\") " pod="openshift-multus/multus-additional-cni-plugins-7rncr" Apr 24 22:29:52.966400 ip-10-0-133-161 kubenswrapper[2574]: I0424 
22:29:52.966347 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ca9127c4-a533-44bc-9593-d1308d3b463f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7rncr\" (UID: \"ca9127c4-a533-44bc-9593-d1308d3b463f\") " pod="openshift-multus/multus-additional-cni-plugins-7rncr" Apr 24 22:29:52.966400 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.966387 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/319510cc-198d-4518-b16b-5a7c26091db0-iptables-alerter-script\") pod \"iptables-alerter-xj45v\" (UID: \"319510cc-198d-4518-b16b-5a7c26091db0\") " pod="openshift-network-operator/iptables-alerter-xj45v" Apr 24 22:29:52.966804 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.966571 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ca9127c4-a533-44bc-9593-d1308d3b463f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7rncr\" (UID: \"ca9127c4-a533-44bc-9593-d1308d3b463f\") " pod="openshift-multus/multus-additional-cni-plugins-7rncr" Apr 24 22:29:52.980612 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.980589 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvjc6\" (UniqueName: \"kubernetes.io/projected/8acc4f4f-831f-4c10-a187-01230734276e-kube-api-access-pvjc6\") pod \"node-resolver-knznw\" (UID: \"8acc4f4f-831f-4c10-a187-01230734276e\") " pod="openshift-dns/node-resolver-knznw" Apr 24 22:29:52.982193 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.982171 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-npksk\" (UniqueName: \"kubernetes.io/projected/ca9127c4-a533-44bc-9593-d1308d3b463f-kube-api-access-npksk\") pod \"multus-additional-cni-plugins-7rncr\" (UID: 
\"ca9127c4-a533-44bc-9593-d1308d3b463f\") " pod="openshift-multus/multus-additional-cni-plugins-7rncr" Apr 24 22:29:52.982977 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:52.982913 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:29:52.982977 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:52.982932 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:29:52.982977 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:52.982944 2574 projected.go:194] Error preparing data for projected volume kube-api-access-sjvd6 for pod openshift-network-diagnostics/network-check-target-rscxl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:52.983245 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:52.983026 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc-kube-api-access-sjvd6 podName:f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc nodeName:}" failed. No retries permitted until 2026-04-24 22:29:53.483009483 +0000 UTC m=+3.206602031 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-sjvd6" (UniqueName: "kubernetes.io/projected/f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc-kube-api-access-sjvd6") pod "network-check-target-rscxl" (UID: "f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:52.984393 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.984374 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-74bjx\" (UniqueName: \"kubernetes.io/projected/319510cc-198d-4518-b16b-5a7c26091db0-kube-api-access-74bjx\") pod \"iptables-alerter-xj45v\" (UID: \"319510cc-198d-4518-b16b-5a7c26091db0\") " pod="openshift-network-operator/iptables-alerter-xj45v" Apr 24 22:29:52.984588 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:52.984571 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-65mhg\" (UniqueName: \"kubernetes.io/projected/d0c4cf71-fe26-4a65-bc22-b98bb5827d73-kube-api-access-65mhg\") pod \"network-metrics-daemon-8ztg8\" (UID: \"d0c4cf71-fe26-4a65-bc22-b98bb5827d73\") " pod="openshift-multus/network-metrics-daemon-8ztg8" Apr 24 22:29:53.051395 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:53.051348 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:29:53.061334 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:53.061309 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-klbmp" Apr 24 22:29:53.067984 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:53.067952 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-xx2md" Apr 24 22:29:53.073559 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:53.073534 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ppjh8" Apr 24 22:29:53.080154 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:53.080135 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9jd7m" Apr 24 22:29:53.086674 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:53.086652 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-l24ds" Apr 24 22:29:53.094183 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:53.094163 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-knznw" Apr 24 22:29:53.100705 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:53.100688 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-xj45v" Apr 24 22:29:53.106264 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:53.106242 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7rncr" Apr 24 22:29:53.470174 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:53.470149 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0c4cf71-fe26-4a65-bc22-b98bb5827d73-metrics-certs\") pod \"network-metrics-daemon-8ztg8\" (UID: \"d0c4cf71-fe26-4a65-bc22-b98bb5827d73\") " pod="openshift-multus/network-metrics-daemon-8ztg8" Apr 24 22:29:53.472424 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:53.470547 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:53.472424 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:53.470629 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0c4cf71-fe26-4a65-bc22-b98bb5827d73-metrics-certs podName:d0c4cf71-fe26-4a65-bc22-b98bb5827d73 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:54.470608814 +0000 UTC m=+4.194201380 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0c4cf71-fe26-4a65-bc22-b98bb5827d73-metrics-certs") pod "network-metrics-daemon-8ztg8" (UID: "d0c4cf71-fe26-4a65-bc22-b98bb5827d73") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:53.530489 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:53.530461 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4cc25c5_df90_4054_8f91_917b98a3eb96.slice/crio-ea5524fd77ea545cf4bc2a132927b204f65f48b26624da288db705e85f3d183b WatchSource:0}: Error finding container ea5524fd77ea545cf4bc2a132927b204f65f48b26624da288db705e85f3d183b: Status 404 returned error can't find the container with id ea5524fd77ea545cf4bc2a132927b204f65f48b26624da288db705e85f3d183b Apr 24 22:29:53.531439 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:53.531414 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeebbd623_0913_43a3_ad50_13184bd5baaa.slice/crio-e20a8f5d158c63a0af04edcae47b2d6e082dc891296fdd024900bbef2e89fcdd WatchSource:0}: Error finding container e20a8f5d158c63a0af04edcae47b2d6e082dc891296fdd024900bbef2e89fcdd: Status 404 returned error can't find the container with id e20a8f5d158c63a0af04edcae47b2d6e082dc891296fdd024900bbef2e89fcdd Apr 24 22:29:53.532781 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:53.532746 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23caaf68_bf31_4bd4_8417_c00b22900ee2.slice/crio-d9eb8ad35006c2fe6f20fed86ed43ab8f2cd155002734009a5edac915ff57b63 WatchSource:0}: Error finding container d9eb8ad35006c2fe6f20fed86ed43ab8f2cd155002734009a5edac915ff57b63: Status 404 returned error can't find the container with id d9eb8ad35006c2fe6f20fed86ed43ab8f2cd155002734009a5edac915ff57b63 Apr 24 22:29:53.535277 
ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:53.535256 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce4f31c6_c297_4d19_b0f3_f05c45c17a8e.slice/crio-686fa99876b572b933ccac734ab35a62235f9176cc046db12ce6fae8fe0e1381 WatchSource:0}: Error finding container 686fa99876b572b933ccac734ab35a62235f9176cc046db12ce6fae8fe0e1381: Status 404 returned error can't find the container with id 686fa99876b572b933ccac734ab35a62235f9176cc046db12ce6fae8fe0e1381 Apr 24 22:29:53.536727 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:53.536680 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea8f7937_1cbc_424c_ba7e_220e1d538dbe.slice/crio-1b84d1a37253d230ba16b05e8a63818fc2ee8f3198506eec41b721cfbc47e560 WatchSource:0}: Error finding container 1b84d1a37253d230ba16b05e8a63818fc2ee8f3198506eec41b721cfbc47e560: Status 404 returned error can't find the container with id 1b84d1a37253d230ba16b05e8a63818fc2ee8f3198506eec41b721cfbc47e560 Apr 24 22:29:53.537993 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:53.537882 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca9127c4_a533_44bc_9593_d1308d3b463f.slice/crio-6d23f132f5e6114b1e1e34ffce6c52cd327e421d70b0d723fcfa65d205c0d15e WatchSource:0}: Error finding container 6d23f132f5e6114b1e1e34ffce6c52cd327e421d70b0d723fcfa65d205c0d15e: Status 404 returned error can't find the container with id 6d23f132f5e6114b1e1e34ffce6c52cd327e421d70b0d723fcfa65d205c0d15e Apr 24 22:29:53.540250 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:53.540113 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod939c10b8_9c56_4502_a65a_30206c40fa9d.slice/crio-956ec85e36545c86db883ba383b6c5e31da5b009f5bb3d8433460d4cd6e265c7 WatchSource:0}: 
Error finding container 956ec85e36545c86db883ba383b6c5e31da5b009f5bb3d8433460d4cd6e265c7: Status 404 returned error can't find the container with id 956ec85e36545c86db883ba383b6c5e31da5b009f5bb3d8433460d4cd6e265c7 Apr 24 22:29:53.541453 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:53.541431 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod319510cc_198d_4518_b16b_5a7c26091db0.slice/crio-6bdfb0a4c458590541d11c271d6eeaea02006b6837a0c6189067a99f8bc916ef WatchSource:0}: Error finding container 6bdfb0a4c458590541d11c271d6eeaea02006b6837a0c6189067a99f8bc916ef: Status 404 returned error can't find the container with id 6bdfb0a4c458590541d11c271d6eeaea02006b6837a0c6189067a99f8bc916ef Apr 24 22:29:53.543085 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:29:53.543064 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8acc4f4f_831f_4c10_a187_01230734276e.slice/crio-64e3195b981b88c58ae8ebf3dbbe51e9c14e2ab3abb1891e822bf0ef792e5715 WatchSource:0}: Error finding container 64e3195b981b88c58ae8ebf3dbbe51e9c14e2ab3abb1891e822bf0ef792e5715: Status 404 returned error can't find the container with id 64e3195b981b88c58ae8ebf3dbbe51e9c14e2ab3abb1891e822bf0ef792e5715 Apr 24 22:29:53.571116 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:53.571096 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjvd6\" (UniqueName: \"kubernetes.io/projected/f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc-kube-api-access-sjvd6\") pod \"network-check-target-rscxl\" (UID: \"f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc\") " pod="openshift-network-diagnostics/network-check-target-rscxl" Apr 24 22:29:53.571263 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:53.571243 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:29:53.571263 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:53.571266 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:29:53.571392 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:53.571278 2574 projected.go:194] Error preparing data for projected volume kube-api-access-sjvd6 for pod openshift-network-diagnostics/network-check-target-rscxl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:53.571392 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:53.571333 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc-kube-api-access-sjvd6 podName:f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc nodeName:}" failed. No retries permitted until 2026-04-24 22:29:54.571315309 +0000 UTC m=+4.294907875 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-sjvd6" (UniqueName: "kubernetes.io/projected/f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc-kube-api-access-sjvd6") pod "network-check-target-rscxl" (UID: "f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:29:53.793654 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:53.793576 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 22:24:51 +0000 UTC" deadline="2028-01-28 14:07:45.269158098 +0000 UTC"
Apr 24 22:29:53.793654 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:53.793606 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15447h37m51.475554444s"
Apr 24 22:29:53.849685 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:53.849647 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xj45v" event={"ID":"319510cc-198d-4518-b16b-5a7c26091db0","Type":"ContainerStarted","Data":"6bdfb0a4c458590541d11c271d6eeaea02006b6837a0c6189067a99f8bc916ef"}
Apr 24 22:29:53.850673 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:53.850628 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xx2md" event={"ID":"939c10b8-9c56-4502-a65a-30206c40fa9d","Type":"ContainerStarted","Data":"956ec85e36545c86db883ba383b6c5e31da5b009f5bb3d8433460d4cd6e265c7"}
Apr 24 22:29:53.852245 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:53.852220 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" event={"ID":"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e","Type":"ContainerStarted","Data":"686fa99876b572b933ccac734ab35a62235f9176cc046db12ce6fae8fe0e1381"}
Apr 24 22:29:53.854551 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:53.854506 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-l24ds" event={"ID":"23caaf68-bf31-4bd4-8417-c00b22900ee2","Type":"ContainerStarted","Data":"d9eb8ad35006c2fe6f20fed86ed43ab8f2cd155002734009a5edac915ff57b63"}
Apr 24 22:29:53.855954 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:53.855891 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-klbmp" event={"ID":"ea8f7937-1cbc-424c-ba7e-220e1d538dbe","Type":"ContainerStarted","Data":"1b84d1a37253d230ba16b05e8a63818fc2ee8f3198506eec41b721cfbc47e560"}
Apr 24 22:29:53.857296 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:53.857274 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-knznw" event={"ID":"8acc4f4f-831f-4c10-a187-01230734276e","Type":"ContainerStarted","Data":"64e3195b981b88c58ae8ebf3dbbe51e9c14e2ab3abb1891e822bf0ef792e5715"}
Apr 24 22:29:53.860426 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:53.859274 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7rncr" event={"ID":"ca9127c4-a533-44bc-9593-d1308d3b463f","Type":"ContainerStarted","Data":"6d23f132f5e6114b1e1e34ffce6c52cd327e421d70b0d723fcfa65d205c0d15e"}
Apr 24 22:29:53.860532 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:53.860414 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ppjh8" event={"ID":"eebbd623-0913-43a3-ad50-13184bd5baaa","Type":"ContainerStarted","Data":"e20a8f5d158c63a0af04edcae47b2d6e082dc891296fdd024900bbef2e89fcdd"}
Apr 24 22:29:53.861770 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:53.861692 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9jd7m" event={"ID":"c4cc25c5-df90-4054-8f91-917b98a3eb96","Type":"ContainerStarted","Data":"ea5524fd77ea545cf4bc2a132927b204f65f48b26624da288db705e85f3d183b"}
Apr 24 22:29:53.866952 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:53.866926 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-161.ec2.internal" event={"ID":"71c8bd7a095056144ad8091ca3c68103","Type":"ContainerStarted","Data":"589a6fc8052170dc51e0aa9c04310ba94f92939a231b95f6c8364add97dd85db"}
Apr 24 22:29:54.478430 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:54.477876 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0c4cf71-fe26-4a65-bc22-b98bb5827d73-metrics-certs\") pod \"network-metrics-daemon-8ztg8\" (UID: \"d0c4cf71-fe26-4a65-bc22-b98bb5827d73\") " pod="openshift-multus/network-metrics-daemon-8ztg8"
Apr 24 22:29:54.478430 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:54.478034 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:29:54.478430 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:54.478096 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0c4cf71-fe26-4a65-bc22-b98bb5827d73-metrics-certs podName:d0c4cf71-fe26-4a65-bc22-b98bb5827d73 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:56.478077959 +0000 UTC m=+6.201670513 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0c4cf71-fe26-4a65-bc22-b98bb5827d73-metrics-certs") pod "network-metrics-daemon-8ztg8" (UID: "d0c4cf71-fe26-4a65-bc22-b98bb5827d73") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:29:54.579858 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:54.579229 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjvd6\" (UniqueName: \"kubernetes.io/projected/f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc-kube-api-access-sjvd6\") pod \"network-check-target-rscxl\" (UID: \"f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc\") " pod="openshift-network-diagnostics/network-check-target-rscxl"
Apr 24 22:29:54.579858 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:54.579395 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 22:29:54.579858 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:54.579414 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 22:29:54.579858 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:54.579425 2574 projected.go:194] Error preparing data for projected volume kube-api-access-sjvd6 for pod openshift-network-diagnostics/network-check-target-rscxl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:29:54.579858 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:54.579481 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc-kube-api-access-sjvd6 podName:f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc nodeName:}" failed. No retries permitted until 2026-04-24 22:29:56.579463476 +0000 UTC m=+6.303056033 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-sjvd6" (UniqueName: "kubernetes.io/projected/f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc-kube-api-access-sjvd6") pod "network-check-target-rscxl" (UID: "f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:29:54.844703 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:54.844198 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rscxl"
Apr 24 22:29:54.844703 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:54.844317 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rscxl" podUID="f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc"
Apr 24 22:29:54.844703 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:54.844581 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ztg8"
Apr 24 22:29:54.844703 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:54.844684 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ztg8" podUID="d0c4cf71-fe26-4a65-bc22-b98bb5827d73"
Apr 24 22:29:54.884948 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:54.884912 2574 generic.go:358] "Generic (PLEG): container finished" podID="17556b0047f1a6a15f4b7d5854560826" containerID="8d5f2247f358f8afea6c2b69e880d54fd5074c99505b71b334ca92b36385f934" exitCode=0
Apr 24 22:29:54.885755 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:54.885725 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal" event={"ID":"17556b0047f1a6a15f4b7d5854560826","Type":"ContainerDied","Data":"8d5f2247f358f8afea6c2b69e880d54fd5074c99505b71b334ca92b36385f934"}
Apr 24 22:29:54.959242 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:54.959166 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-161.ec2.internal" podStartSLOduration=2.959145803 podStartE2EDuration="2.959145803s" podCreationTimestamp="2026-04-24 22:29:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:29:53.885375711 +0000 UTC m=+3.608968282" watchObservedRunningTime="2026-04-24 22:29:54.959145803 +0000 UTC m=+4.682738374"
Apr 24 22:29:55.038857 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:55.038125 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-crl79"]
Apr 24 22:29:55.041405 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:55.041177 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-crl79"
Apr 24 22:29:55.041405 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:55.041256 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-crl79" podUID="53fd54b3-2948-44d4-9d51-6c630d4b0a08"
Apr 24 22:29:55.083239 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:55.083205 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/53fd54b3-2948-44d4-9d51-6c630d4b0a08-dbus\") pod \"global-pull-secret-syncer-crl79\" (UID: \"53fd54b3-2948-44d4-9d51-6c630d4b0a08\") " pod="kube-system/global-pull-secret-syncer-crl79"
Apr 24 22:29:55.083393 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:55.083257 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/53fd54b3-2948-44d4-9d51-6c630d4b0a08-kubelet-config\") pod \"global-pull-secret-syncer-crl79\" (UID: \"53fd54b3-2948-44d4-9d51-6c630d4b0a08\") " pod="kube-system/global-pull-secret-syncer-crl79"
Apr 24 22:29:55.083393 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:55.083319 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/53fd54b3-2948-44d4-9d51-6c630d4b0a08-original-pull-secret\") pod \"global-pull-secret-syncer-crl79\" (UID: \"53fd54b3-2948-44d4-9d51-6c630d4b0a08\") " pod="kube-system/global-pull-secret-syncer-crl79"
Apr 24 22:29:55.184484 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:55.184450 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/53fd54b3-2948-44d4-9d51-6c630d4b0a08-original-pull-secret\") pod \"global-pull-secret-syncer-crl79\" (UID: \"53fd54b3-2948-44d4-9d51-6c630d4b0a08\") " pod="kube-system/global-pull-secret-syncer-crl79"
Apr 24 22:29:55.184641 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:55.184522 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/53fd54b3-2948-44d4-9d51-6c630d4b0a08-dbus\") pod \"global-pull-secret-syncer-crl79\" (UID: \"53fd54b3-2948-44d4-9d51-6c630d4b0a08\") " pod="kube-system/global-pull-secret-syncer-crl79"
Apr 24 22:29:55.184641 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:55.184551 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/53fd54b3-2948-44d4-9d51-6c630d4b0a08-kubelet-config\") pod \"global-pull-secret-syncer-crl79\" (UID: \"53fd54b3-2948-44d4-9d51-6c630d4b0a08\") " pod="kube-system/global-pull-secret-syncer-crl79"
Apr 24 22:29:55.184784 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:55.184652 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/53fd54b3-2948-44d4-9d51-6c630d4b0a08-kubelet-config\") pod \"global-pull-secret-syncer-crl79\" (UID: \"53fd54b3-2948-44d4-9d51-6c630d4b0a08\") " pod="kube-system/global-pull-secret-syncer-crl79"
Apr 24 22:29:55.184784 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:55.184755 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 22:29:55.184883 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:55.184812 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53fd54b3-2948-44d4-9d51-6c630d4b0a08-original-pull-secret podName:53fd54b3-2948-44d4-9d51-6c630d4b0a08 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:55.684793928 +0000 UTC m=+5.408386476 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/53fd54b3-2948-44d4-9d51-6c630d4b0a08-original-pull-secret") pod "global-pull-secret-syncer-crl79" (UID: "53fd54b3-2948-44d4-9d51-6c630d4b0a08") : object "kube-system"/"original-pull-secret" not registered
Apr 24 22:29:55.185115 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:55.185094 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/53fd54b3-2948-44d4-9d51-6c630d4b0a08-dbus\") pod \"global-pull-secret-syncer-crl79\" (UID: \"53fd54b3-2948-44d4-9d51-6c630d4b0a08\") " pod="kube-system/global-pull-secret-syncer-crl79"
Apr 24 22:29:55.689756 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:55.689716 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/53fd54b3-2948-44d4-9d51-6c630d4b0a08-original-pull-secret\") pod \"global-pull-secret-syncer-crl79\" (UID: \"53fd54b3-2948-44d4-9d51-6c630d4b0a08\") " pod="kube-system/global-pull-secret-syncer-crl79"
Apr 24 22:29:55.690033 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:55.689904 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 22:29:55.690033 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:55.689986 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53fd54b3-2948-44d4-9d51-6c630d4b0a08-original-pull-secret podName:53fd54b3-2948-44d4-9d51-6c630d4b0a08 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:56.689953014 +0000 UTC m=+6.413545582 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/53fd54b3-2948-44d4-9d51-6c630d4b0a08-original-pull-secret") pod "global-pull-secret-syncer-crl79" (UID: "53fd54b3-2948-44d4-9d51-6c630d4b0a08") : object "kube-system"/"original-pull-secret" not registered
Apr 24 22:29:55.896503 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:55.896467 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal" event={"ID":"17556b0047f1a6a15f4b7d5854560826","Type":"ContainerStarted","Data":"3e8bd592f10061b462b3894155c475024298fcf68a5bc35eb2acea301091ea05"}
Apr 24 22:29:56.496897 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:56.496380 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0c4cf71-fe26-4a65-bc22-b98bb5827d73-metrics-certs\") pod \"network-metrics-daemon-8ztg8\" (UID: \"d0c4cf71-fe26-4a65-bc22-b98bb5827d73\") " pod="openshift-multus/network-metrics-daemon-8ztg8"
Apr 24 22:29:56.496897 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:56.496516 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:29:56.496897 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:56.496576 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0c4cf71-fe26-4a65-bc22-b98bb5827d73-metrics-certs podName:d0c4cf71-fe26-4a65-bc22-b98bb5827d73 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:00.496557792 +0000 UTC m=+10.220150354 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0c4cf71-fe26-4a65-bc22-b98bb5827d73-metrics-certs") pod "network-metrics-daemon-8ztg8" (UID: "d0c4cf71-fe26-4a65-bc22-b98bb5827d73") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:29:56.597767 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:56.597722 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjvd6\" (UniqueName: \"kubernetes.io/projected/f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc-kube-api-access-sjvd6\") pod \"network-check-target-rscxl\" (UID: \"f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc\") " pod="openshift-network-diagnostics/network-check-target-rscxl"
Apr 24 22:29:56.597924 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:56.597881 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 22:29:56.597924 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:56.597900 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 22:29:56.597924 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:56.597912 2574 projected.go:194] Error preparing data for projected volume kube-api-access-sjvd6 for pod openshift-network-diagnostics/network-check-target-rscxl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:29:56.598108 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:56.597989 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc-kube-api-access-sjvd6 podName:f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc nodeName:}" failed. No retries permitted until 2026-04-24 22:30:00.597970337 +0000 UTC m=+10.321562899 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-sjvd6" (UniqueName: "kubernetes.io/projected/f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc-kube-api-access-sjvd6") pod "network-check-target-rscxl" (UID: "f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:29:56.698702 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:56.698648 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/53fd54b3-2948-44d4-9d51-6c630d4b0a08-original-pull-secret\") pod \"global-pull-secret-syncer-crl79\" (UID: \"53fd54b3-2948-44d4-9d51-6c630d4b0a08\") " pod="kube-system/global-pull-secret-syncer-crl79"
Apr 24 22:29:56.698877 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:56.698785 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 22:29:56.698877 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:56.698844 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53fd54b3-2948-44d4-9d51-6c630d4b0a08-original-pull-secret podName:53fd54b3-2948-44d4-9d51-6c630d4b0a08 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:58.698826758 +0000 UTC m=+8.422419315 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/53fd54b3-2948-44d4-9d51-6c630d4b0a08-original-pull-secret") pod "global-pull-secret-syncer-crl79" (UID: "53fd54b3-2948-44d4-9d51-6c630d4b0a08") : object "kube-system"/"original-pull-secret" not registered
Apr 24 22:29:56.842614 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:56.842536 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rscxl"
Apr 24 22:29:56.842614 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:56.842575 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-crl79"
Apr 24 22:29:56.842839 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:56.842536 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ztg8"
Apr 24 22:29:56.842839 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:56.842657 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rscxl" podUID="f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc"
Apr 24 22:29:56.842839 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:56.842808 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ztg8" podUID="d0c4cf71-fe26-4a65-bc22-b98bb5827d73"
Apr 24 22:29:56.843008 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:56.842901 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-crl79" podUID="53fd54b3-2948-44d4-9d51-6c630d4b0a08"
Apr 24 22:29:58.717262 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:58.717223 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/53fd54b3-2948-44d4-9d51-6c630d4b0a08-original-pull-secret\") pod \"global-pull-secret-syncer-crl79\" (UID: \"53fd54b3-2948-44d4-9d51-6c630d4b0a08\") " pod="kube-system/global-pull-secret-syncer-crl79"
Apr 24 22:29:58.717698 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:58.717393 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 22:29:58.717698 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:58.717471 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53fd54b3-2948-44d4-9d51-6c630d4b0a08-original-pull-secret podName:53fd54b3-2948-44d4-9d51-6c630d4b0a08 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:02.717450714 +0000 UTC m=+12.441043266 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/53fd54b3-2948-44d4-9d51-6c630d4b0a08-original-pull-secret") pod "global-pull-secret-syncer-crl79" (UID: "53fd54b3-2948-44d4-9d51-6c630d4b0a08") : object "kube-system"/"original-pull-secret" not registered
Apr 24 22:29:58.841870 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:58.841837 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-crl79"
Apr 24 22:29:58.842054 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:58.841985 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-crl79" podUID="53fd54b3-2948-44d4-9d51-6c630d4b0a08"
Apr 24 22:29:58.842054 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:58.842026 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ztg8"
Apr 24 22:29:58.842158 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:29:58.842035 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rscxl"
Apr 24 22:29:58.842189 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:58.842155 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ztg8" podUID="d0c4cf71-fe26-4a65-bc22-b98bb5827d73"
Apr 24 22:29:58.842263 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:29:58.842213 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rscxl" podUID="f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc"
Apr 24 22:30:00.532476 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:00.532263 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0c4cf71-fe26-4a65-bc22-b98bb5827d73-metrics-certs\") pod \"network-metrics-daemon-8ztg8\" (UID: \"d0c4cf71-fe26-4a65-bc22-b98bb5827d73\") " pod="openshift-multus/network-metrics-daemon-8ztg8"
Apr 24 22:30:00.532943 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:00.532432 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:30:00.532943 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:00.532580 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0c4cf71-fe26-4a65-bc22-b98bb5827d73-metrics-certs podName:d0c4cf71-fe26-4a65-bc22-b98bb5827d73 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:08.532563539 +0000 UTC m=+18.256156090 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0c4cf71-fe26-4a65-bc22-b98bb5827d73-metrics-certs") pod "network-metrics-daemon-8ztg8" (UID: "d0c4cf71-fe26-4a65-bc22-b98bb5827d73") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:30:00.633390 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:00.633276 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjvd6\" (UniqueName: \"kubernetes.io/projected/f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc-kube-api-access-sjvd6\") pod \"network-check-target-rscxl\" (UID: \"f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc\") " pod="openshift-network-diagnostics/network-check-target-rscxl"
Apr 24 22:30:00.633590 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:00.633436 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 22:30:00.633590 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:00.633455 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 22:30:00.633590 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:00.633466 2574 projected.go:194] Error preparing data for projected volume kube-api-access-sjvd6 for pod openshift-network-diagnostics/network-check-target-rscxl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:30:00.633590 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:00.633528 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc-kube-api-access-sjvd6 podName:f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc nodeName:}" failed. No retries permitted until 2026-04-24 22:30:08.633511096 +0000 UTC m=+18.357103662 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-sjvd6" (UniqueName: "kubernetes.io/projected/f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc-kube-api-access-sjvd6") pod "network-check-target-rscxl" (UID: "f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:30:00.845689 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:00.845615 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ztg8"
Apr 24 22:30:00.845689 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:00.845643 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-crl79"
Apr 24 22:30:00.845904 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:00.845630 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rscxl"
Apr 24 22:30:00.845904 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:00.845739 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ztg8" podUID="d0c4cf71-fe26-4a65-bc22-b98bb5827d73"
Apr 24 22:30:00.845904 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:00.845826 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rscxl" podUID="f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc"
Apr 24 22:30:00.846065 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:00.845914 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-crl79" podUID="53fd54b3-2948-44d4-9d51-6c630d4b0a08"
Apr 24 22:30:02.749410 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:02.749318 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/53fd54b3-2948-44d4-9d51-6c630d4b0a08-original-pull-secret\") pod \"global-pull-secret-syncer-crl79\" (UID: \"53fd54b3-2948-44d4-9d51-6c630d4b0a08\") " pod="kube-system/global-pull-secret-syncer-crl79"
Apr 24 22:30:02.749827 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:02.749485 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 22:30:02.749827 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:02.749571 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53fd54b3-2948-44d4-9d51-6c630d4b0a08-original-pull-secret podName:53fd54b3-2948-44d4-9d51-6c630d4b0a08 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:10.749549636 +0000 UTC m=+20.473142187 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/53fd54b3-2948-44d4-9d51-6c630d4b0a08-original-pull-secret") pod "global-pull-secret-syncer-crl79" (UID: "53fd54b3-2948-44d4-9d51-6c630d4b0a08") : object "kube-system"/"original-pull-secret" not registered
Apr 24 22:30:02.841979 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:02.841930 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ztg8"
Apr 24 22:30:02.842154 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:02.842098 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ztg8" podUID="d0c4cf71-fe26-4a65-bc22-b98bb5827d73"
Apr 24 22:30:02.842211 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:02.842186 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-crl79"
Apr 24 22:30:02.842323 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:02.842299 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-crl79" podUID="53fd54b3-2948-44d4-9d51-6c630d4b0a08"
Apr 24 22:30:02.842386 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:02.842356 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rscxl"
Apr 24 22:30:02.842465 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:02.842435 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rscxl" podUID="f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc"
Apr 24 22:30:04.845830 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:04.845797 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ztg8"
Apr 24 22:30:04.846333 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:04.845842 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rscxl"
Apr 24 22:30:04.846333 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:04.845894 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-crl79"
Apr 24 22:30:04.846333 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:04.846010 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-crl79" podUID="53fd54b3-2948-44d4-9d51-6c630d4b0a08"
Apr 24 22:30:04.846333 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:04.846145 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rscxl" podUID="f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc"
Apr 24 22:30:04.846333 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:04.846250 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ztg8" podUID="d0c4cf71-fe26-4a65-bc22-b98bb5827d73"
Apr 24 22:30:06.844550 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:06.844521 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-crl79"
Apr 24 22:30:06.844994 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:06.844521 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rscxl"
Apr 24 22:30:06.844994 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:06.844622 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-crl79" podUID="53fd54b3-2948-44d4-9d51-6c630d4b0a08" Apr 24 22:30:06.844994 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:06.844521 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ztg8" Apr 24 22:30:06.844994 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:06.844696 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rscxl" podUID="f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc" Apr 24 22:30:06.844994 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:06.844803 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8ztg8" podUID="d0c4cf71-fe26-4a65-bc22-b98bb5827d73" Apr 24 22:30:08.595137 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:08.595099 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0c4cf71-fe26-4a65-bc22-b98bb5827d73-metrics-certs\") pod \"network-metrics-daemon-8ztg8\" (UID: \"d0c4cf71-fe26-4a65-bc22-b98bb5827d73\") " pod="openshift-multus/network-metrics-daemon-8ztg8" Apr 24 22:30:08.595614 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:08.595242 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:08.595614 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:08.595301 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0c4cf71-fe26-4a65-bc22-b98bb5827d73-metrics-certs podName:d0c4cf71-fe26-4a65-bc22-b98bb5827d73 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:24.595284978 +0000 UTC m=+34.318877540 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0c4cf71-fe26-4a65-bc22-b98bb5827d73-metrics-certs") pod "network-metrics-daemon-8ztg8" (UID: "d0c4cf71-fe26-4a65-bc22-b98bb5827d73") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:08.695920 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:08.695891 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjvd6\" (UniqueName: \"kubernetes.io/projected/f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc-kube-api-access-sjvd6\") pod \"network-check-target-rscxl\" (UID: \"f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc\") " pod="openshift-network-diagnostics/network-check-target-rscxl" Apr 24 22:30:08.696114 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:08.696092 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:30:08.696162 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:08.696121 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:30:08.696162 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:08.696134 2574 projected.go:194] Error preparing data for projected volume kube-api-access-sjvd6 for pod openshift-network-diagnostics/network-check-target-rscxl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:08.696246 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:08.696193 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc-kube-api-access-sjvd6 podName:f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc nodeName:}" failed. 
No retries permitted until 2026-04-24 22:30:24.696175534 +0000 UTC m=+34.419768088 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-sjvd6" (UniqueName: "kubernetes.io/projected/f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc-kube-api-access-sjvd6") pod "network-check-target-rscxl" (UID: "f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:08.841798 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:08.841771 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-crl79" Apr 24 22:30:08.841940 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:08.841771 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ztg8" Apr 24 22:30:08.841940 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:08.841873 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-crl79" podUID="53fd54b3-2948-44d4-9d51-6c630d4b0a08" Apr 24 22:30:08.842048 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:08.841776 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rscxl" Apr 24 22:30:08.842048 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:08.841984 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ztg8" podUID="d0c4cf71-fe26-4a65-bc22-b98bb5827d73" Apr 24 22:30:08.842048 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:08.842018 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rscxl" podUID="f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc" Apr 24 22:30:10.813377 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:10.813184 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/53fd54b3-2948-44d4-9d51-6c630d4b0a08-original-pull-secret\") pod \"global-pull-secret-syncer-crl79\" (UID: \"53fd54b3-2948-44d4-9d51-6c630d4b0a08\") " pod="kube-system/global-pull-secret-syncer-crl79" Apr 24 22:30:10.813377 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:10.813309 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 22:30:10.813377 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:10.813356 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53fd54b3-2948-44d4-9d51-6c630d4b0a08-original-pull-secret podName:53fd54b3-2948-44d4-9d51-6c630d4b0a08 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:26.813340137 +0000 UTC m=+36.536932702 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/53fd54b3-2948-44d4-9d51-6c630d4b0a08-original-pull-secret") pod "global-pull-secret-syncer-crl79" (UID: "53fd54b3-2948-44d4-9d51-6c630d4b0a08") : object "kube-system"/"original-pull-secret" not registered Apr 24 22:30:10.843128 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:10.843104 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ztg8" Apr 24 22:30:10.843245 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:10.843224 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ztg8" podUID="d0c4cf71-fe26-4a65-bc22-b98bb5827d73" Apr 24 22:30:10.843321 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:10.843306 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-crl79" Apr 24 22:30:10.843404 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:10.843384 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-crl79" podUID="53fd54b3-2948-44d4-9d51-6c630d4b0a08" Apr 24 22:30:10.843462 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:10.843453 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rscxl" Apr 24 22:30:10.843556 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:10.843537 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rscxl" podUID="f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc" Apr 24 22:30:10.933242 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:10.932153 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-l24ds" event={"ID":"23caaf68-bf31-4bd4-8417-c00b22900ee2","Type":"ContainerStarted","Data":"6d3f2da1a1107f122bb85714f43aaee98fc8a433a5575539d606d28cdfe2f65d"} Apr 24 22:30:10.940298 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:10.939776 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ppjh8" event={"ID":"eebbd623-0913-43a3-ad50-13184bd5baaa","Type":"ContainerStarted","Data":"649112a63b68560aae52366b6b59d459c93c0b1bdc77ee145eb99f5962e0664a"} Apr 24 22:30:10.943310 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:10.943260 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9jd7m" event={"ID":"c4cc25c5-df90-4054-8f91-917b98a3eb96","Type":"ContainerStarted","Data":"55e8b1ca1b26b7c2795f1dd84ff604e6a514bff693c454910a94241440cf56fe"} Apr 24 22:30:10.959466 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:10.958080 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-l24ds" podStartSLOduration=8.642956985 podStartE2EDuration="20.958061897s" podCreationTimestamp="2026-04-24 22:29:50 +0000 UTC" firstStartedPulling="2026-04-24 22:29:53.534464192 +0000 UTC m=+3.258056739" 
lastFinishedPulling="2026-04-24 22:30:05.849569091 +0000 UTC m=+15.573161651" observedRunningTime="2026-04-24 22:30:10.957005193 +0000 UTC m=+20.680597763" watchObservedRunningTime="2026-04-24 22:30:10.958061897 +0000 UTC m=+20.681654468" Apr 24 22:30:10.959466 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:10.958236 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal" podStartSLOduration=18.958227329 podStartE2EDuration="18.958227329s" podCreationTimestamp="2026-04-24 22:29:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:29:55.911735994 +0000 UTC m=+5.635328565" watchObservedRunningTime="2026-04-24 22:30:10.958227329 +0000 UTC m=+20.681819900" Apr 24 22:30:11.003243 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:11.003201 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-klbmp" podStartSLOduration=3.891126777 podStartE2EDuration="21.003187097s" podCreationTimestamp="2026-04-24 22:29:50 +0000 UTC" firstStartedPulling="2026-04-24 22:29:53.540389006 +0000 UTC m=+3.263981557" lastFinishedPulling="2026-04-24 22:30:10.652449322 +0000 UTC m=+20.376041877" observedRunningTime="2026-04-24 22:30:10.978574914 +0000 UTC m=+20.702167485" watchObservedRunningTime="2026-04-24 22:30:11.003187097 +0000 UTC m=+20.726779666" Apr 24 22:30:11.003419 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:11.003399 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-ppjh8" podStartSLOduration=3.853250639 podStartE2EDuration="21.003394753s" podCreationTimestamp="2026-04-24 22:29:50 +0000 UTC" firstStartedPulling="2026-04-24 22:29:53.533422744 +0000 UTC m=+3.257015292" lastFinishedPulling="2026-04-24 22:30:10.683566853 +0000 UTC m=+20.407159406" 
observedRunningTime="2026-04-24 22:30:11.003129383 +0000 UTC m=+20.726721976" watchObservedRunningTime="2026-04-24 22:30:11.003394753 +0000 UTC m=+20.726987532" Apr 24 22:30:11.781332 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:11.781138 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 22:30:11.819686 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:11.819579 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T22:30:11.781329221Z","UUID":"cdfe97e7-6c82-47c9-9a08-b1650bcdeb0e","Handler":null,"Name":"","Endpoint":""} Apr 24 22:30:11.820971 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:11.820940 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 22:30:11.821056 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:11.820980 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 22:30:11.939837 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:11.939815 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-l24ds" Apr 24 22:30:11.940571 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:11.940539 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-l24ds" Apr 24 22:30:11.946023 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:11.945988 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xx2md" event={"ID":"939c10b8-9c56-4502-a65a-30206c40fa9d","Type":"ContainerStarted","Data":"2e0f638bab6a447f4f71aad654be42d9cd9e42f44cd1bce9413682740bca094a"} 
Apr 24 22:30:11.948161 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:11.948136 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" event={"ID":"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e","Type":"ContainerStarted","Data":"5464de0461714690f8ea7e1a8246310c5196cf6a671edf2d1aa30fd655138d13"} Apr 24 22:30:11.948263 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:11.948166 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" event={"ID":"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e","Type":"ContainerStarted","Data":"49354678a3be9b5633b8a40a6124b7a442de86d58b021d9a2411f7c57647b71d"} Apr 24 22:30:11.948263 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:11.948180 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" event={"ID":"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e","Type":"ContainerStarted","Data":"e63aec5c75eaf6536f1ed111ad821ad539aea72e9258dce342d8291bc688a6c7"} Apr 24 22:30:11.948263 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:11.948192 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" event={"ID":"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e","Type":"ContainerStarted","Data":"38a544b707a5ecfabb1d993c5f43a58b7d7c4f00960700fc11e5bdbfd7a85604"} Apr 24 22:30:11.948263 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:11.948206 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" event={"ID":"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e","Type":"ContainerStarted","Data":"db7995b8b062cfaa82fe3950bf6fc097b8f553dd708eb4316cb8c77ef546de28"} Apr 24 22:30:11.948263 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:11.948219 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" 
event={"ID":"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e","Type":"ContainerStarted","Data":"d2081ca3d4ddb6282b0b6153e7498e3b72e41e06d96d955064b8453b1f344581"} Apr 24 22:30:11.949295 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:11.949276 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-klbmp" event={"ID":"ea8f7937-1cbc-424c-ba7e-220e1d538dbe","Type":"ContainerStarted","Data":"4d9158b8b3ddb2aa1f9716edea5746131eaef80e90e11bb3df9d73691e824b05"} Apr 24 22:30:11.950514 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:11.950497 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-knznw" event={"ID":"8acc4f4f-831f-4c10-a187-01230734276e","Type":"ContainerStarted","Data":"c31a125b9c7b3cd41a7a9e137c160245e75d5ee00b7ed8731d7c5ed3e41ac324"} Apr 24 22:30:11.951724 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:11.951699 2574 generic.go:358] "Generic (PLEG): container finished" podID="ca9127c4-a533-44bc-9593-d1308d3b463f" containerID="b002d1f19f96aa1121b4823d6a543b59ca093d6f2625e4888e251b596a6b4bea" exitCode=0 Apr 24 22:30:11.951799 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:11.951763 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7rncr" event={"ID":"ca9127c4-a533-44bc-9593-d1308d3b463f","Type":"ContainerDied","Data":"b002d1f19f96aa1121b4823d6a543b59ca093d6f2625e4888e251b596a6b4bea"} Apr 24 22:30:11.955779 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:11.955760 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9jd7m" event={"ID":"c4cc25c5-df90-4054-8f91-917b98a3eb96","Type":"ContainerStarted","Data":"518c1c8fd583877ca7c2478536b180d00486eb1f71470f7fdcf37c55a8392826"} Apr 24 22:30:11.976566 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:11.976522 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-knznw" 
podStartSLOduration=4.178392534 podStartE2EDuration="20.976510517s" podCreationTimestamp="2026-04-24 22:29:51 +0000 UTC" firstStartedPulling="2026-04-24 22:29:53.54465333 +0000 UTC m=+3.268245883" lastFinishedPulling="2026-04-24 22:30:10.342771317 +0000 UTC m=+20.066363866" observedRunningTime="2026-04-24 22:30:11.976185961 +0000 UTC m=+21.699778531" watchObservedRunningTime="2026-04-24 22:30:11.976510517 +0000 UTC m=+21.700103086" Apr 24 22:30:12.026046 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:12.025972 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-xx2md" podStartSLOduration=5.226050456 podStartE2EDuration="22.025945794s" podCreationTimestamp="2026-04-24 22:29:50 +0000 UTC" firstStartedPulling="2026-04-24 22:29:53.542871596 +0000 UTC m=+3.266464144" lastFinishedPulling="2026-04-24 22:30:10.34276692 +0000 UTC m=+20.066359482" observedRunningTime="2026-04-24 22:30:12.025746065 +0000 UTC m=+21.749338636" watchObservedRunningTime="2026-04-24 22:30:12.025945794 +0000 UTC m=+21.749538363" Apr 24 22:30:12.842595 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:12.842502 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rscxl" Apr 24 22:30:12.842595 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:12.842565 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ztg8" Apr 24 22:30:12.843237 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:12.842677 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rscxl" podUID="f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc" Apr 24 22:30:12.843237 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:12.842726 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ztg8" podUID="d0c4cf71-fe26-4a65-bc22-b98bb5827d73" Apr 24 22:30:12.843237 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:12.842779 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-crl79" Apr 24 22:30:12.843237 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:12.842853 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-crl79" podUID="53fd54b3-2948-44d4-9d51-6c630d4b0a08" Apr 24 22:30:12.959632 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:12.959581 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9jd7m" event={"ID":"c4cc25c5-df90-4054-8f91-917b98a3eb96","Type":"ContainerStarted","Data":"9ada2c2fc2aec565391df8ace1ae0d77cc642d17b2aece728d8f701cf0013904"} Apr 24 22:30:12.960974 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:12.960935 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xj45v" event={"ID":"319510cc-198d-4518-b16b-5a7c26091db0","Type":"ContainerStarted","Data":"0f2dbe1b8e6d0242f18f65eb0be0ca942c4e3a1036ba0478c62cd30fc7d4d4cb"} Apr 24 22:30:12.961080 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:12.960983 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 22:30:12.985211 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:12.985173 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9jd7m" podStartSLOduration=4.002597569 podStartE2EDuration="22.985160971s" podCreationTimestamp="2026-04-24 22:29:50 +0000 UTC" firstStartedPulling="2026-04-24 22:29:53.532655981 +0000 UTC m=+3.256248529" lastFinishedPulling="2026-04-24 22:30:12.51521937 +0000 UTC m=+22.238811931" observedRunningTime="2026-04-24 22:30:12.984480534 +0000 UTC m=+22.708073105" watchObservedRunningTime="2026-04-24 22:30:12.985160971 +0000 UTC m=+22.708753524" Apr 24 22:30:13.008172 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:13.008125 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-xj45v" podStartSLOduration=5.899683471 podStartE2EDuration="23.00810984s" podCreationTimestamp="2026-04-24 22:29:50 +0000 UTC" firstStartedPulling="2026-04-24 22:29:53.544023764 
+0000 UTC m=+3.267616312" lastFinishedPulling="2026-04-24 22:30:10.652450124 +0000 UTC m=+20.376042681" observedRunningTime="2026-04-24 22:30:13.007864866 +0000 UTC m=+22.731457437" watchObservedRunningTime="2026-04-24 22:30:13.00810984 +0000 UTC m=+22.731702411" Apr 24 22:30:13.966542 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:13.966475 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" event={"ID":"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e","Type":"ContainerStarted","Data":"cd76215b30d1bdb445797152883486ec56f0ef75e5c0deec84e52f953ae45bf9"} Apr 24 22:30:14.841871 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:14.841833 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rscxl" Apr 24 22:30:14.842062 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:14.841886 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ztg8" Apr 24 22:30:14.842062 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:14.841908 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-crl79" Apr 24 22:30:14.842062 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:14.842036 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8ztg8" podUID="d0c4cf71-fe26-4a65-bc22-b98bb5827d73"
Apr 24 22:30:14.842222 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:14.842116 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rscxl" podUID="f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc"
Apr 24 22:30:14.842222 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:14.842206 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-crl79" podUID="53fd54b3-2948-44d4-9d51-6c630d4b0a08"
Apr 24 22:30:15.974290 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:15.973935 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" event={"ID":"ce4f31c6-c297-4d19-b0f3-f05c45c17a8e","Type":"ContainerStarted","Data":"1800db6b4aa338722b90ac2b80615b846cc457488b9ea68f6ee7937aac66ed13"}
Apr 24 22:30:15.974290 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:15.974260 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm"
Apr 24 22:30:15.974290 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:15.974290 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm"
Apr 24 22:30:15.975356 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:15.974303 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm"
Apr 24 22:30:15.988859 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:15.988726 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm"
Apr 24 22:30:15.989296 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:15.989081 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm"
Apr 24 22:30:16.014469 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:16.014427 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" podStartSLOduration=8.687289639 podStartE2EDuration="26.014414242s" podCreationTimestamp="2026-04-24 22:29:50 +0000 UTC" firstStartedPulling="2026-04-24 22:29:53.537027911 +0000 UTC m=+3.260620465" lastFinishedPulling="2026-04-24 22:30:10.864152513 +0000 UTC m=+20.587745068" observedRunningTime="2026-04-24 22:30:16.013776677 +0000 UTC m=+25.737369248" watchObservedRunningTime="2026-04-24 22:30:16.014414242 +0000 UTC m=+25.738006811"
Apr 24 22:30:16.847141 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:16.845535 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rscxl"
Apr 24 22:30:16.847141 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:16.845681 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rscxl" podUID="f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc"
Apr 24 22:30:16.847141 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:16.846185 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ztg8"
Apr 24 22:30:16.847141 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:16.846325 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ztg8" podUID="d0c4cf71-fe26-4a65-bc22-b98bb5827d73"
Apr 24 22:30:16.847141 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:16.846401 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-crl79"
Apr 24 22:30:16.847141 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:16.846500 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-crl79" podUID="53fd54b3-2948-44d4-9d51-6c630d4b0a08"
Apr 24 22:30:16.909026 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:16.908987 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-l24ds"
Apr 24 22:30:16.909215 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:16.909125 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 24 22:30:16.909525 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:16.909506 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-l24ds"
Apr 24 22:30:16.976734 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:16.976703 2574 generic.go:358] "Generic (PLEG): container finished" podID="ca9127c4-a533-44bc-9593-d1308d3b463f" containerID="196c510c854dc6ac11aad849c7a626f7c099181c996b1d045b48c4f0cabca526" exitCode=0
Apr 24 22:30:16.977201 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:16.976788 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7rncr" event={"ID":"ca9127c4-a533-44bc-9593-d1308d3b463f","Type":"ContainerDied","Data":"196c510c854dc6ac11aad849c7a626f7c099181c996b1d045b48c4f0cabca526"}
Apr 24 22:30:17.858888 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:17.858674 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rscxl"]
Apr 24 22:30:17.859163 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:17.859004 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rscxl"
Apr 24 22:30:17.859163 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:17.859133 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rscxl" podUID="f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc"
Apr 24 22:30:17.861104 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:17.861080 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8ztg8"]
Apr 24 22:30:17.861224 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:17.861176 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ztg8"
Apr 24 22:30:17.861309 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:17.861285 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ztg8" podUID="d0c4cf71-fe26-4a65-bc22-b98bb5827d73"
Apr 24 22:30:17.861751 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:17.861733 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-crl79"]
Apr 24 22:30:17.861823 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:17.861807 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-crl79"
Apr 24 22:30:17.861915 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:17.861895 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-crl79" podUID="53fd54b3-2948-44d4-9d51-6c630d4b0a08"
Apr 24 22:30:17.980439 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:17.980352 2574 generic.go:358] "Generic (PLEG): container finished" podID="ca9127c4-a533-44bc-9593-d1308d3b463f" containerID="107a5c704eb259e8a119a5acd93e190f245d670aa0666b561536003f7b29d1c0" exitCode=0
Apr 24 22:30:17.980900 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:17.980437 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7rncr" event={"ID":"ca9127c4-a533-44bc-9593-d1308d3b463f","Type":"ContainerDied","Data":"107a5c704eb259e8a119a5acd93e190f245d670aa0666b561536003f7b29d1c0"}
Apr 24 22:30:18.984987 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:18.984936 2574 generic.go:358] "Generic (PLEG): container finished" podID="ca9127c4-a533-44bc-9593-d1308d3b463f" containerID="9d663784ec8f5d2a467daf9ee076356f4c8b18c39ac31aedfc1bc39ddefa06e3" exitCode=0
Apr 24 22:30:18.985420 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:18.984998 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7rncr" event={"ID":"ca9127c4-a533-44bc-9593-d1308d3b463f","Type":"ContainerDied","Data":"9d663784ec8f5d2a467daf9ee076356f4c8b18c39ac31aedfc1bc39ddefa06e3"}
Apr 24 22:30:19.842219 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:19.842149 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rscxl"
Apr 24 22:30:19.842219 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:19.842178 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-crl79"
Apr 24 22:30:19.842416 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:19.842149 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ztg8"
Apr 24 22:30:19.842416 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:19.842271 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rscxl" podUID="f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc"
Apr 24 22:30:19.842416 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:19.842371 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ztg8" podUID="d0c4cf71-fe26-4a65-bc22-b98bb5827d73"
Apr 24 22:30:19.842558 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:19.842463 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-crl79" podUID="53fd54b3-2948-44d4-9d51-6c630d4b0a08"
Apr 24 22:30:21.842662 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:21.842632 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ztg8"
Apr 24 22:30:21.843345 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:21.842634 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rscxl"
Apr 24 22:30:21.843345 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:21.842772 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ztg8" podUID="d0c4cf71-fe26-4a65-bc22-b98bb5827d73"
Apr 24 22:30:21.843345 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:21.842632 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-crl79"
Apr 24 22:30:21.843345 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:21.842824 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rscxl" podUID="f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc"
Apr 24 22:30:21.843345 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:21.842923 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-crl79" podUID="53fd54b3-2948-44d4-9d51-6c630d4b0a08"
Apr 24 22:30:23.560593 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.560524 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-161.ec2.internal" event="NodeReady"
Apr 24 22:30:23.561026 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.560683 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 24 22:30:23.666501 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.666466 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-gt62l"]
Apr 24 22:30:23.711427 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.711396 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-vffmw"]
Apr 24 22:30:23.711601 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.711452 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-gt62l"
Apr 24 22:30:23.715127 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.715106 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 24 22:30:23.715417 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.715395 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-jqx86\""
Apr 24 22:30:23.715934 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.715917 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 24 22:30:23.729486 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.729467 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gt62l"]
Apr 24 22:30:23.729486 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.729489 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vffmw"]
Apr 24 22:30:23.729654 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.729568 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vffmw"
Apr 24 22:30:23.734174 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.734142 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 24 22:30:23.734303 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.734217 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 24 22:30:23.734303 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.734150 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 24 22:30:23.734626 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.734600 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-zzr8g\""
Apr 24 22:30:23.815861 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.815782 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8a25583f-2bb0-4f33-93bd-3e30aec48cee-metrics-tls\") pod \"dns-default-gt62l\" (UID: \"8a25583f-2bb0-4f33-93bd-3e30aec48cee\") " pod="openshift-dns/dns-default-gt62l"
Apr 24 22:30:23.815861 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.815842 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8eff2e0b-1c86-4882-8b55-3fb02eb38ebe-cert\") pod \"ingress-canary-vffmw\" (UID: \"8eff2e0b-1c86-4882-8b55-3fb02eb38ebe\") " pod="openshift-ingress-canary/ingress-canary-vffmw"
Apr 24 22:30:23.816080 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.815875 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a25583f-2bb0-4f33-93bd-3e30aec48cee-config-volume\") pod \"dns-default-gt62l\" (UID: \"8a25583f-2bb0-4f33-93bd-3e30aec48cee\") " pod="openshift-dns/dns-default-gt62l"
Apr 24 22:30:23.816080 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.815975 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8a25583f-2bb0-4f33-93bd-3e30aec48cee-tmp-dir\") pod \"dns-default-gt62l\" (UID: \"8a25583f-2bb0-4f33-93bd-3e30aec48cee\") " pod="openshift-dns/dns-default-gt62l"
Apr 24 22:30:23.816080 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.816019 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gj7s\" (UniqueName: \"kubernetes.io/projected/8a25583f-2bb0-4f33-93bd-3e30aec48cee-kube-api-access-6gj7s\") pod \"dns-default-gt62l\" (UID: \"8a25583f-2bb0-4f33-93bd-3e30aec48cee\") " pod="openshift-dns/dns-default-gt62l"
Apr 24 22:30:23.816195 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.816081 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kc6b\" (UniqueName: \"kubernetes.io/projected/8eff2e0b-1c86-4882-8b55-3fb02eb38ebe-kube-api-access-2kc6b\") pod \"ingress-canary-vffmw\" (UID: \"8eff2e0b-1c86-4882-8b55-3fb02eb38ebe\") " pod="openshift-ingress-canary/ingress-canary-vffmw"
Apr 24 22:30:23.842172 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.842138 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-crl79"
Apr 24 22:30:23.842339 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.842138 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rscxl"
Apr 24 22:30:23.842339 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.842144 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ztg8"
Apr 24 22:30:23.846222 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.846170 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 22:30:23.846222 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.846202 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 22:30:23.846409 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.846267 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-cgksw\""
Apr 24 22:30:23.846409 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.846295 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 24 22:30:23.846409 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.846311 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 22:30:23.846578 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.846560 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-h92dk\""
Apr 24 22:30:23.917020 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.916979 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2kc6b\" (UniqueName: \"kubernetes.io/projected/8eff2e0b-1c86-4882-8b55-3fb02eb38ebe-kube-api-access-2kc6b\") pod \"ingress-canary-vffmw\" (UID: \"8eff2e0b-1c86-4882-8b55-3fb02eb38ebe\") " pod="openshift-ingress-canary/ingress-canary-vffmw"
Apr 24 22:30:23.917205 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.917044 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8a25583f-2bb0-4f33-93bd-3e30aec48cee-metrics-tls\") pod \"dns-default-gt62l\" (UID: \"8a25583f-2bb0-4f33-93bd-3e30aec48cee\") " pod="openshift-dns/dns-default-gt62l"
Apr 24 22:30:23.917205 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.917091 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8eff2e0b-1c86-4882-8b55-3fb02eb38ebe-cert\") pod \"ingress-canary-vffmw\" (UID: \"8eff2e0b-1c86-4882-8b55-3fb02eb38ebe\") " pod="openshift-ingress-canary/ingress-canary-vffmw"
Apr 24 22:30:23.917205 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.917118 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a25583f-2bb0-4f33-93bd-3e30aec48cee-config-volume\") pod \"dns-default-gt62l\" (UID: \"8a25583f-2bb0-4f33-93bd-3e30aec48cee\") " pod="openshift-dns/dns-default-gt62l"
Apr 24 22:30:23.917205 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.917144 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8a25583f-2bb0-4f33-93bd-3e30aec48cee-tmp-dir\") pod \"dns-default-gt62l\" (UID: \"8a25583f-2bb0-4f33-93bd-3e30aec48cee\") " pod="openshift-dns/dns-default-gt62l"
Apr 24 22:30:23.917205 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.917177 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6gj7s\" (UniqueName: \"kubernetes.io/projected/8a25583f-2bb0-4f33-93bd-3e30aec48cee-kube-api-access-6gj7s\") pod \"dns-default-gt62l\" (UID: \"8a25583f-2bb0-4f33-93bd-3e30aec48cee\") " pod="openshift-dns/dns-default-gt62l"
Apr 24 22:30:23.917446 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:23.917205 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 22:30:23.917446 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:23.917279 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a25583f-2bb0-4f33-93bd-3e30aec48cee-metrics-tls podName:8a25583f-2bb0-4f33-93bd-3e30aec48cee nodeName:}" failed. No retries permitted until 2026-04-24 22:30:24.417259395 +0000 UTC m=+34.140851947 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8a25583f-2bb0-4f33-93bd-3e30aec48cee-metrics-tls") pod "dns-default-gt62l" (UID: "8a25583f-2bb0-4f33-93bd-3e30aec48cee") : secret "dns-default-metrics-tls" not found
Apr 24 22:30:23.917446 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:23.917205 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 22:30:23.917446 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:23.917315 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8eff2e0b-1c86-4882-8b55-3fb02eb38ebe-cert podName:8eff2e0b-1c86-4882-8b55-3fb02eb38ebe nodeName:}" failed. No retries permitted until 2026-04-24 22:30:24.417305457 +0000 UTC m=+34.140898016 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8eff2e0b-1c86-4882-8b55-3fb02eb38ebe-cert") pod "ingress-canary-vffmw" (UID: "8eff2e0b-1c86-4882-8b55-3fb02eb38ebe") : secret "canary-serving-cert" not found
Apr 24 22:30:23.917670 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.917554 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8a25583f-2bb0-4f33-93bd-3e30aec48cee-tmp-dir\") pod \"dns-default-gt62l\" (UID: \"8a25583f-2bb0-4f33-93bd-3e30aec48cee\") " pod="openshift-dns/dns-default-gt62l"
Apr 24 22:30:23.917775 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.917758 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a25583f-2bb0-4f33-93bd-3e30aec48cee-config-volume\") pod \"dns-default-gt62l\" (UID: \"8a25583f-2bb0-4f33-93bd-3e30aec48cee\") " pod="openshift-dns/dns-default-gt62l"
Apr 24 22:30:23.941272 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.941085 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gj7s\" (UniqueName: \"kubernetes.io/projected/8a25583f-2bb0-4f33-93bd-3e30aec48cee-kube-api-access-6gj7s\") pod \"dns-default-gt62l\" (UID: \"8a25583f-2bb0-4f33-93bd-3e30aec48cee\") " pod="openshift-dns/dns-default-gt62l"
Apr 24 22:30:23.941389 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:23.941143 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kc6b\" (UniqueName: \"kubernetes.io/projected/8eff2e0b-1c86-4882-8b55-3fb02eb38ebe-kube-api-access-2kc6b\") pod \"ingress-canary-vffmw\" (UID: \"8eff2e0b-1c86-4882-8b55-3fb02eb38ebe\") " pod="openshift-ingress-canary/ingress-canary-vffmw"
Apr 24 22:30:24.420874 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:24.420832 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8a25583f-2bb0-4f33-93bd-3e30aec48cee-metrics-tls\") pod \"dns-default-gt62l\" (UID: \"8a25583f-2bb0-4f33-93bd-3e30aec48cee\") " pod="openshift-dns/dns-default-gt62l"
Apr 24 22:30:24.421063 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:24.420887 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8eff2e0b-1c86-4882-8b55-3fb02eb38ebe-cert\") pod \"ingress-canary-vffmw\" (UID: \"8eff2e0b-1c86-4882-8b55-3fb02eb38ebe\") " pod="openshift-ingress-canary/ingress-canary-vffmw"
Apr 24 22:30:24.421063 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:24.421034 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 22:30:24.421155 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:24.421036 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 22:30:24.421155 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:24.421129 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8eff2e0b-1c86-4882-8b55-3fb02eb38ebe-cert podName:8eff2e0b-1c86-4882-8b55-3fb02eb38ebe nodeName:}" failed. No retries permitted until 2026-04-24 22:30:25.421114292 +0000 UTC m=+35.144706840 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8eff2e0b-1c86-4882-8b55-3fb02eb38ebe-cert") pod "ingress-canary-vffmw" (UID: "8eff2e0b-1c86-4882-8b55-3fb02eb38ebe") : secret "canary-serving-cert" not found
Apr 24 22:30:24.421222 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:24.421175 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a25583f-2bb0-4f33-93bd-3e30aec48cee-metrics-tls podName:8a25583f-2bb0-4f33-93bd-3e30aec48cee nodeName:}" failed. No retries permitted until 2026-04-24 22:30:25.421158442 +0000 UTC m=+35.144751008 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8a25583f-2bb0-4f33-93bd-3e30aec48cee-metrics-tls") pod "dns-default-gt62l" (UID: "8a25583f-2bb0-4f33-93bd-3e30aec48cee") : secret "dns-default-metrics-tls" not found
Apr 24 22:30:24.621831 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:24.621799 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0c4cf71-fe26-4a65-bc22-b98bb5827d73-metrics-certs\") pod \"network-metrics-daemon-8ztg8\" (UID: \"d0c4cf71-fe26-4a65-bc22-b98bb5827d73\") " pod="openshift-multus/network-metrics-daemon-8ztg8"
Apr 24 22:30:24.622225 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:24.621927 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 22:30:24.622225 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:24.621990 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0c4cf71-fe26-4a65-bc22-b98bb5827d73-metrics-certs podName:d0c4cf71-fe26-4a65-bc22-b98bb5827d73 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:56.621977155 +0000 UTC m=+66.345569702 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0c4cf71-fe26-4a65-bc22-b98bb5827d73-metrics-certs") pod "network-metrics-daemon-8ztg8" (UID: "d0c4cf71-fe26-4a65-bc22-b98bb5827d73") : secret "metrics-daemon-secret" not found
Apr 24 22:30:24.722747 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:24.722671 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjvd6\" (UniqueName: \"kubernetes.io/projected/f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc-kube-api-access-sjvd6\") pod \"network-check-target-rscxl\" (UID: \"f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc\") " pod="openshift-network-diagnostics/network-check-target-rscxl"
Apr 24 22:30:24.725951 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:24.725920 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjvd6\" (UniqueName: \"kubernetes.io/projected/f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc-kube-api-access-sjvd6\") pod \"network-check-target-rscxl\" (UID: \"f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc\") " pod="openshift-network-diagnostics/network-check-target-rscxl"
Apr 24 22:30:24.759539 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:24.759499 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rscxl"
Apr 24 22:30:25.008980 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:25.008942 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rscxl"]
Apr 24 22:30:25.012499 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:30:25.012470 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6f16e94_fa06_4d71_bfc7_e6b272e8d1bc.slice/crio-178d0b5c7d017b3e5ad2cbcfac90b72771f8cc7fc176f9dac815b0b3042d571f WatchSource:0}: Error finding container 178d0b5c7d017b3e5ad2cbcfac90b72771f8cc7fc176f9dac815b0b3042d571f: Status 404 returned error can't find the container with id 178d0b5c7d017b3e5ad2cbcfac90b72771f8cc7fc176f9dac815b0b3042d571f
Apr 24 22:30:25.427796 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:25.427721 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8a25583f-2bb0-4f33-93bd-3e30aec48cee-metrics-tls\") pod \"dns-default-gt62l\" (UID: \"8a25583f-2bb0-4f33-93bd-3e30aec48cee\") " pod="openshift-dns/dns-default-gt62l"
Apr 24 22:30:25.427796 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:25.427763 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8eff2e0b-1c86-4882-8b55-3fb02eb38ebe-cert\") pod \"ingress-canary-vffmw\" (UID: \"8eff2e0b-1c86-4882-8b55-3fb02eb38ebe\") " pod="openshift-ingress-canary/ingress-canary-vffmw"
Apr 24 22:30:25.427953 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:25.427867 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 22:30:25.427953 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:25.427920 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 22:30:25.427953 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:25.427931 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a25583f-2bb0-4f33-93bd-3e30aec48cee-metrics-tls podName:8a25583f-2bb0-4f33-93bd-3e30aec48cee nodeName:}" failed. No retries permitted until 2026-04-24 22:30:27.427914156 +0000 UTC m=+37.151506704 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8a25583f-2bb0-4f33-93bd-3e30aec48cee-metrics-tls") pod "dns-default-gt62l" (UID: "8a25583f-2bb0-4f33-93bd-3e30aec48cee") : secret "dns-default-metrics-tls" not found
Apr 24 22:30:25.428075 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:25.427976 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8eff2e0b-1c86-4882-8b55-3fb02eb38ebe-cert podName:8eff2e0b-1c86-4882-8b55-3fb02eb38ebe nodeName:}" failed. No retries permitted until 2026-04-24 22:30:27.427948869 +0000 UTC m=+37.151541417 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8eff2e0b-1c86-4882-8b55-3fb02eb38ebe-cert") pod "ingress-canary-vffmw" (UID: "8eff2e0b-1c86-4882-8b55-3fb02eb38ebe") : secret "canary-serving-cert" not found
Apr 24 22:30:26.003874 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:26.003843 2574 generic.go:358] "Generic (PLEG): container finished" podID="ca9127c4-a533-44bc-9593-d1308d3b463f" containerID="7855b125e7bca89facbe06f715c5799b6d61a937fea2bc53e1ad8ae40f859431" exitCode=0
Apr 24 22:30:26.004353 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:26.003925 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7rncr" event={"ID":"ca9127c4-a533-44bc-9593-d1308d3b463f","Type":"ContainerDied","Data":"7855b125e7bca89facbe06f715c5799b6d61a937fea2bc53e1ad8ae40f859431"}
Apr 24 22:30:26.005424 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:26.004986 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rscxl" event={"ID":"f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc","Type":"ContainerStarted","Data":"178d0b5c7d017b3e5ad2cbcfac90b72771f8cc7fc176f9dac815b0b3042d571f"}
Apr 24 22:30:26.836763 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:26.836510 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/53fd54b3-2948-44d4-9d51-6c630d4b0a08-original-pull-secret\") pod \"global-pull-secret-syncer-crl79\" (UID: \"53fd54b3-2948-44d4-9d51-6c630d4b0a08\") " pod="kube-system/global-pull-secret-syncer-crl79"
Apr 24 22:30:26.840300 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:26.840277 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/53fd54b3-2948-44d4-9d51-6c630d4b0a08-original-pull-secret\") pod \"global-pull-secret-syncer-crl79\" (UID: \"53fd54b3-2948-44d4-9d51-6c630d4b0a08\") " pod="kube-system/global-pull-secret-syncer-crl79"
Apr 24 22:30:26.852914 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:26.852892 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-crl79"
Apr 24 22:30:27.010391 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:27.010351 2574 generic.go:358] "Generic (PLEG): container finished" podID="ca9127c4-a533-44bc-9593-d1308d3b463f" containerID="4f42157a1e86b325591e353803d14d1196cd7c0fd5821f7d584a395fbba8a02d" exitCode=0
Apr 24 22:30:27.010871 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:27.010395 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7rncr" event={"ID":"ca9127c4-a533-44bc-9593-d1308d3b463f","Type":"ContainerDied","Data":"4f42157a1e86b325591e353803d14d1196cd7c0fd5821f7d584a395fbba8a02d"}
Apr 24 22:30:27.442687 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:27.442645 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8a25583f-2bb0-4f33-93bd-3e30aec48cee-metrics-tls\") pod \"dns-default-gt62l\" (UID: \"8a25583f-2bb0-4f33-93bd-3e30aec48cee\") " pod="openshift-dns/dns-default-gt62l"
Apr 24 22:30:27.442829 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:27.442696 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8eff2e0b-1c86-4882-8b55-3fb02eb38ebe-cert\") pod \"ingress-canary-vffmw\" (UID: \"8eff2e0b-1c86-4882-8b55-3fb02eb38ebe\") " pod="openshift-ingress-canary/ingress-canary-vffmw"
Apr 24 22:30:27.442829 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:27.442805 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 22:30:27.442905 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:27.442856 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8eff2e0b-1c86-4882-8b55-3fb02eb38ebe-cert podName:8eff2e0b-1c86-4882-8b55-3fb02eb38ebe nodeName:}" failed. No retries permitted until 2026-04-24 22:30:31.442843255 +0000 UTC m=+41.166435803 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8eff2e0b-1c86-4882-8b55-3fb02eb38ebe-cert") pod "ingress-canary-vffmw" (UID: "8eff2e0b-1c86-4882-8b55-3fb02eb38ebe") : secret "canary-serving-cert" not found
Apr 24 22:30:27.442905 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:27.442805 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 22:30:27.442991 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:27.442918 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a25583f-2bb0-4f33-93bd-3e30aec48cee-metrics-tls podName:8a25583f-2bb0-4f33-93bd-3e30aec48cee nodeName:}" failed. No retries permitted until 2026-04-24 22:30:31.442905925 +0000 UTC m=+41.166498479 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8a25583f-2bb0-4f33-93bd-3e30aec48cee-metrics-tls") pod "dns-default-gt62l" (UID: "8a25583f-2bb0-4f33-93bd-3e30aec48cee") : secret "dns-default-metrics-tls" not found Apr 24 22:30:27.857186 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:27.857160 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-crl79"] Apr 24 22:30:27.948835 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:30:27.948795 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53fd54b3_2948_44d4_9d51_6c630d4b0a08.slice/crio-11146310b8feeb335c48e5cc8cd88eaa128648fd8442f56ff09774ff40fb2c9a WatchSource:0}: Error finding container 11146310b8feeb335c48e5cc8cd88eaa128648fd8442f56ff09774ff40fb2c9a: Status 404 returned error can't find the container with id 11146310b8feeb335c48e5cc8cd88eaa128648fd8442f56ff09774ff40fb2c9a Apr 24 22:30:28.015286 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:28.015255 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7rncr" event={"ID":"ca9127c4-a533-44bc-9593-d1308d3b463f","Type":"ContainerStarted","Data":"8f870a49e3ecaa43ec1f08117c7ba1c4fa343174c2397acdbadd06028395b843"} Apr 24 22:30:28.019686 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:28.019662 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-crl79" event={"ID":"53fd54b3-2948-44d4-9d51-6c630d4b0a08","Type":"ContainerStarted","Data":"11146310b8feeb335c48e5cc8cd88eaa128648fd8442f56ff09774ff40fb2c9a"} Apr 24 22:30:28.047026 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:28.046979 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-7rncr" podStartSLOduration=5.718917528 podStartE2EDuration="37.046947162s" 
podCreationTimestamp="2026-04-24 22:29:51 +0000 UTC" firstStartedPulling="2026-04-24 22:29:53.542047175 +0000 UTC m=+3.265639730" lastFinishedPulling="2026-04-24 22:30:24.870076814 +0000 UTC m=+34.593669364" observedRunningTime="2026-04-24 22:30:28.043379928 +0000 UTC m=+37.766972499" watchObservedRunningTime="2026-04-24 22:30:28.046947162 +0000 UTC m=+37.770539733" Apr 24 22:30:29.023837 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:29.023594 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rscxl" event={"ID":"f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc","Type":"ContainerStarted","Data":"a9d4a27040242522ebcc21406dfa22cc45a83eeced3807e50ddc565377477461"} Apr 24 22:30:29.024305 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:29.024106 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-rscxl" Apr 24 22:30:29.048355 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:29.048290 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-rscxl" podStartSLOduration=36.071933486 podStartE2EDuration="39.04827357s" podCreationTimestamp="2026-04-24 22:29:50 +0000 UTC" firstStartedPulling="2026-04-24 22:30:25.014533367 +0000 UTC m=+34.738125919" lastFinishedPulling="2026-04-24 22:30:27.990873452 +0000 UTC m=+37.714466003" observedRunningTime="2026-04-24 22:30:29.046786048 +0000 UTC m=+38.770378618" watchObservedRunningTime="2026-04-24 22:30:29.04827357 +0000 UTC m=+38.771866141" Apr 24 22:30:31.468086 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:31.468051 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8a25583f-2bb0-4f33-93bd-3e30aec48cee-metrics-tls\") pod \"dns-default-gt62l\" (UID: \"8a25583f-2bb0-4f33-93bd-3e30aec48cee\") " pod="openshift-dns/dns-default-gt62l" Apr 24 
22:30:31.468086 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:31.468090 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8eff2e0b-1c86-4882-8b55-3fb02eb38ebe-cert\") pod \"ingress-canary-vffmw\" (UID: \"8eff2e0b-1c86-4882-8b55-3fb02eb38ebe\") " pod="openshift-ingress-canary/ingress-canary-vffmw" Apr 24 22:30:31.468556 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:31.468205 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:30:31.468556 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:31.468219 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:30:31.468556 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:31.468258 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8eff2e0b-1c86-4882-8b55-3fb02eb38ebe-cert podName:8eff2e0b-1c86-4882-8b55-3fb02eb38ebe nodeName:}" failed. No retries permitted until 2026-04-24 22:30:39.468244327 +0000 UTC m=+49.191836875 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8eff2e0b-1c86-4882-8b55-3fb02eb38ebe-cert") pod "ingress-canary-vffmw" (UID: "8eff2e0b-1c86-4882-8b55-3fb02eb38ebe") : secret "canary-serving-cert" not found Apr 24 22:30:31.468556 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:31.468287 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a25583f-2bb0-4f33-93bd-3e30aec48cee-metrics-tls podName:8a25583f-2bb0-4f33-93bd-3e30aec48cee nodeName:}" failed. No retries permitted until 2026-04-24 22:30:39.46826766 +0000 UTC m=+49.191860216 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8a25583f-2bb0-4f33-93bd-3e30aec48cee-metrics-tls") pod "dns-default-gt62l" (UID: "8a25583f-2bb0-4f33-93bd-3e30aec48cee") : secret "dns-default-metrics-tls" not found Apr 24 22:30:32.031267 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:32.031181 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-crl79" event={"ID":"53fd54b3-2948-44d4-9d51-6c630d4b0a08","Type":"ContainerStarted","Data":"5f0fc1b7394e5b3a9a6950485bcd4497f66e73394152e51ffddb35ede53ac7eb"} Apr 24 22:30:32.047654 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:32.047605 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-crl79" podStartSLOduration=33.405514917 podStartE2EDuration="37.047591211s" podCreationTimestamp="2026-04-24 22:29:55 +0000 UTC" firstStartedPulling="2026-04-24 22:30:27.979405085 +0000 UTC m=+37.702997633" lastFinishedPulling="2026-04-24 22:30:31.62148138 +0000 UTC m=+41.345073927" observedRunningTime="2026-04-24 22:30:32.046933152 +0000 UTC m=+41.770525747" watchObservedRunningTime="2026-04-24 22:30:32.047591211 +0000 UTC m=+41.771183775" Apr 24 22:30:39.521363 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:39.521323 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8a25583f-2bb0-4f33-93bd-3e30aec48cee-metrics-tls\") pod \"dns-default-gt62l\" (UID: \"8a25583f-2bb0-4f33-93bd-3e30aec48cee\") " pod="openshift-dns/dns-default-gt62l" Apr 24 22:30:39.521363 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:39.521369 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8eff2e0b-1c86-4882-8b55-3fb02eb38ebe-cert\") pod \"ingress-canary-vffmw\" (UID: \"8eff2e0b-1c86-4882-8b55-3fb02eb38ebe\") " 
pod="openshift-ingress-canary/ingress-canary-vffmw" Apr 24 22:30:39.521873 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:39.521477 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:30:39.521873 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:39.521492 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:30:39.521873 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:39.521533 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8eff2e0b-1c86-4882-8b55-3fb02eb38ebe-cert podName:8eff2e0b-1c86-4882-8b55-3fb02eb38ebe nodeName:}" failed. No retries permitted until 2026-04-24 22:30:55.521518089 +0000 UTC m=+65.245110637 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8eff2e0b-1c86-4882-8b55-3fb02eb38ebe-cert") pod "ingress-canary-vffmw" (UID: "8eff2e0b-1c86-4882-8b55-3fb02eb38ebe") : secret "canary-serving-cert" not found Apr 24 22:30:39.521873 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:39.521566 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a25583f-2bb0-4f33-93bd-3e30aec48cee-metrics-tls podName:8a25583f-2bb0-4f33-93bd-3e30aec48cee nodeName:}" failed. No retries permitted until 2026-04-24 22:30:55.521547681 +0000 UTC m=+65.245140231 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8a25583f-2bb0-4f33-93bd-3e30aec48cee-metrics-tls") pod "dns-default-gt62l" (UID: "8a25583f-2bb0-4f33-93bd-3e30aec48cee") : secret "dns-default-metrics-tls" not found Apr 24 22:30:47.997819 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:47.997792 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b4zlm" Apr 24 22:30:55.523971 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:55.523933 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8a25583f-2bb0-4f33-93bd-3e30aec48cee-metrics-tls\") pod \"dns-default-gt62l\" (UID: \"8a25583f-2bb0-4f33-93bd-3e30aec48cee\") " pod="openshift-dns/dns-default-gt62l" Apr 24 22:30:55.524423 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:55.523992 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8eff2e0b-1c86-4882-8b55-3fb02eb38ebe-cert\") pod \"ingress-canary-vffmw\" (UID: \"8eff2e0b-1c86-4882-8b55-3fb02eb38ebe\") " pod="openshift-ingress-canary/ingress-canary-vffmw" Apr 24 22:30:55.524423 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:55.524070 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:30:55.524423 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:55.524082 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:30:55.524423 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:55.524121 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8eff2e0b-1c86-4882-8b55-3fb02eb38ebe-cert podName:8eff2e0b-1c86-4882-8b55-3fb02eb38ebe nodeName:}" failed. 
No retries permitted until 2026-04-24 22:31:27.524105979 +0000 UTC m=+97.247698527 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8eff2e0b-1c86-4882-8b55-3fb02eb38ebe-cert") pod "ingress-canary-vffmw" (UID: "8eff2e0b-1c86-4882-8b55-3fb02eb38ebe") : secret "canary-serving-cert" not found Apr 24 22:30:55.524423 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:55.524134 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a25583f-2bb0-4f33-93bd-3e30aec48cee-metrics-tls podName:8a25583f-2bb0-4f33-93bd-3e30aec48cee nodeName:}" failed. No retries permitted until 2026-04-24 22:31:27.524128169 +0000 UTC m=+97.247720716 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8a25583f-2bb0-4f33-93bd-3e30aec48cee-metrics-tls") pod "dns-default-gt62l" (UID: "8a25583f-2bb0-4f33-93bd-3e30aec48cee") : secret "dns-default-metrics-tls" not found Apr 24 22:30:56.631087 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:30:56.631050 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0c4cf71-fe26-4a65-bc22-b98bb5827d73-metrics-certs\") pod \"network-metrics-daemon-8ztg8\" (UID: \"d0c4cf71-fe26-4a65-bc22-b98bb5827d73\") " pod="openshift-multus/network-metrics-daemon-8ztg8" Apr 24 22:30:56.631483 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:56.631191 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 22:30:56.631483 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:30:56.631267 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0c4cf71-fe26-4a65-bc22-b98bb5827d73-metrics-certs podName:d0c4cf71-fe26-4a65-bc22-b98bb5827d73 nodeName:}" failed. 
No retries permitted until 2026-04-24 22:32:00.631251492 +0000 UTC m=+130.354844045 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0c4cf71-fe26-4a65-bc22-b98bb5827d73-metrics-certs") pod "network-metrics-daemon-8ztg8" (UID: "d0c4cf71-fe26-4a65-bc22-b98bb5827d73") : secret "metrics-daemon-secret" not found Apr 24 22:31:01.030827 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:31:01.030796 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-rscxl" Apr 24 22:31:27.543214 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:31:27.543166 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8a25583f-2bb0-4f33-93bd-3e30aec48cee-metrics-tls\") pod \"dns-default-gt62l\" (UID: \"8a25583f-2bb0-4f33-93bd-3e30aec48cee\") " pod="openshift-dns/dns-default-gt62l" Apr 24 22:31:27.543214 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:31:27.543224 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8eff2e0b-1c86-4882-8b55-3fb02eb38ebe-cert\") pod \"ingress-canary-vffmw\" (UID: \"8eff2e0b-1c86-4882-8b55-3fb02eb38ebe\") " pod="openshift-ingress-canary/ingress-canary-vffmw" Apr 24 22:31:27.543633 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:31:27.543314 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:31:27.543633 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:31:27.543378 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8eff2e0b-1c86-4882-8b55-3fb02eb38ebe-cert podName:8eff2e0b-1c86-4882-8b55-3fb02eb38ebe nodeName:}" failed. No retries permitted until 2026-04-24 22:32:31.543363764 +0000 UTC m=+161.266956312 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8eff2e0b-1c86-4882-8b55-3fb02eb38ebe-cert") pod "ingress-canary-vffmw" (UID: "8eff2e0b-1c86-4882-8b55-3fb02eb38ebe") : secret "canary-serving-cert" not found Apr 24 22:31:27.543633 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:31:27.543314 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:31:27.543633 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:31:27.543465 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a25583f-2bb0-4f33-93bd-3e30aec48cee-metrics-tls podName:8a25583f-2bb0-4f33-93bd-3e30aec48cee nodeName:}" failed. No retries permitted until 2026-04-24 22:32:31.543448046 +0000 UTC m=+161.267040595 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8a25583f-2bb0-4f33-93bd-3e30aec48cee-metrics-tls") pod "dns-default-gt62l" (UID: "8a25583f-2bb0-4f33-93bd-3e30aec48cee") : secret "dns-default-metrics-tls" not found Apr 24 22:32:00.662105 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:00.662055 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0c4cf71-fe26-4a65-bc22-b98bb5827d73-metrics-certs\") pod \"network-metrics-daemon-8ztg8\" (UID: \"d0c4cf71-fe26-4a65-bc22-b98bb5827d73\") " pod="openshift-multus/network-metrics-daemon-8ztg8" Apr 24 22:32:00.662581 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:32:00.662192 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 22:32:00.662581 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:32:00.662258 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0c4cf71-fe26-4a65-bc22-b98bb5827d73-metrics-certs podName:d0c4cf71-fe26-4a65-bc22-b98bb5827d73 nodeName:}" failed. 
No retries permitted until 2026-04-24 22:34:02.662242538 +0000 UTC m=+252.385835086 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0c4cf71-fe26-4a65-bc22-b98bb5827d73-metrics-certs") pod "network-metrics-daemon-8ztg8" (UID: "d0c4cf71-fe26-4a65-bc22-b98bb5827d73") : secret "metrics-daemon-secret" not found Apr 24 22:32:14.891270 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:14.890990 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-m7xn9"] Apr 24 22:32:14.893813 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:14.893796 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-6z9wv"] Apr 24 22:32:14.893953 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:14.893936 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-m7xn9" Apr 24 22:32:14.896345 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:14.896329 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zq5vs"] Apr 24 22:32:14.896464 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:14.896450 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-6z9wv" Apr 24 22:32:14.899250 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:14.899193 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 22:32:14.899780 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:14.899745 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 24 22:32:14.899987 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:14.899949 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-9tg6g\"" Apr 24 22:32:14.900165 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:14.900138 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 24 22:32:14.900852 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:14.900832 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 22:32:14.901125 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:14.901107 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 24 22:32:14.901339 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:14.901323 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 24 22:32:14.901803 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:14.901782 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zq5vs" Apr 24 22:32:14.903276 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:14.903258 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-8t8zx\"" Apr 24 22:32:14.903400 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:14.903269 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 24 22:32:14.903468 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:14.903310 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 24 22:32:14.905269 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:14.905253 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 24 22:32:14.905421 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:14.905402 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 24 22:32:14.905520 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:14.905346 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-6s6tp\"" Apr 24 22:32:14.905620 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:14.905266 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 24 22:32:14.906394 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:14.906372 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-m7xn9"] Apr 24 22:32:14.906942 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:14.906877 2574 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 24 22:32:14.907224 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:14.907197 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-6z9wv"] Apr 24 22:32:14.908912 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:14.908892 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 24 22:32:14.909011 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:14.908972 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zq5vs"] Apr 24 22:32:14.989580 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:14.989547 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-c7ccbc59-779tb"] Apr 24 22:32:14.992427 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:14.992410 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-c7ccbc59-779tb" Apr 24 22:32:14.996917 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:14.996900 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 22:32:14.997297 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:14.997281 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-9rsh7\"" Apr 24 22:32:14.998592 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:14.998565 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 22:32:15.000082 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.000065 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 22:32:15.020141 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.020116 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-c7ccbc59-779tb"] Apr 24 22:32:15.032780 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.032755 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 22:32:15.055331 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.055303 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0595875-db43-466a-aa45-15f3138253c4-serving-cert\") pod \"insights-operator-585dfdc468-m7xn9\" (UID: \"e0595875-db43-466a-aa45-15f3138253c4\") " pod="openshift-insights/insights-operator-585dfdc468-m7xn9" Apr 24 22:32:15.055437 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.055344 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/2ae54e6c-a291-4b2e-8885-ba0d08f9048c-config\") pod \"console-operator-9d4b6777b-6z9wv\" (UID: \"2ae54e6c-a291-4b2e-8885-ba0d08f9048c\") " pod="openshift-console-operator/console-operator-9d4b6777b-6z9wv" Apr 24 22:32:15.055437 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.055366 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ae54e6c-a291-4b2e-8885-ba0d08f9048c-serving-cert\") pod \"console-operator-9d4b6777b-6z9wv\" (UID: \"2ae54e6c-a291-4b2e-8885-ba0d08f9048c\") " pod="openshift-console-operator/console-operator-9d4b6777b-6z9wv" Apr 24 22:32:15.055513 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.055441 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/e0595875-db43-466a-aa45-15f3138253c4-snapshots\") pod \"insights-operator-585dfdc468-m7xn9\" (UID: \"e0595875-db43-466a-aa45-15f3138253c4\") " pod="openshift-insights/insights-operator-585dfdc468-m7xn9" Apr 24 22:32:15.055513 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.055471 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7c99265f-97ef-4683-8d9c-f7c17dd3f1ec-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zq5vs\" (UID: \"7c99265f-97ef-4683-8d9c-f7c17dd3f1ec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zq5vs" Apr 24 22:32:15.055572 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.055496 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0595875-db43-466a-aa45-15f3138253c4-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-m7xn9\" (UID: 
\"e0595875-db43-466a-aa45-15f3138253c4\") " pod="openshift-insights/insights-operator-585dfdc468-m7xn9" Apr 24 22:32:15.055572 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.055560 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwvjz\" (UniqueName: \"kubernetes.io/projected/e0595875-db43-466a-aa45-15f3138253c4-kube-api-access-bwvjz\") pod \"insights-operator-585dfdc468-m7xn9\" (UID: \"e0595875-db43-466a-aa45-15f3138253c4\") " pod="openshift-insights/insights-operator-585dfdc468-m7xn9" Apr 24 22:32:15.055635 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.055589 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th5ks\" (UniqueName: \"kubernetes.io/projected/2ae54e6c-a291-4b2e-8885-ba0d08f9048c-kube-api-access-th5ks\") pod \"console-operator-9d4b6777b-6z9wv\" (UID: \"2ae54e6c-a291-4b2e-8885-ba0d08f9048c\") " pod="openshift-console-operator/console-operator-9d4b6777b-6z9wv" Apr 24 22:32:15.055804 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.055631 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0595875-db43-466a-aa45-15f3138253c4-service-ca-bundle\") pod \"insights-operator-585dfdc468-m7xn9\" (UID: \"e0595875-db43-466a-aa45-15f3138253c4\") " pod="openshift-insights/insights-operator-585dfdc468-m7xn9" Apr 24 22:32:15.055804 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.055649 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr8t5\" (UniqueName: \"kubernetes.io/projected/7c99265f-97ef-4683-8d9c-f7c17dd3f1ec-kube-api-access-zr8t5\") pod \"cluster-samples-operator-6dc5bdb6b4-zq5vs\" (UID: \"7c99265f-97ef-4683-8d9c-f7c17dd3f1ec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zq5vs" Apr 24 
22:32:15.055804 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.055673 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e0595875-db43-466a-aa45-15f3138253c4-tmp\") pod \"insights-operator-585dfdc468-m7xn9\" (UID: \"e0595875-db43-466a-aa45-15f3138253c4\") " pod="openshift-insights/insights-operator-585dfdc468-m7xn9" Apr 24 22:32:15.055804 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.055686 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ae54e6c-a291-4b2e-8885-ba0d08f9048c-trusted-ca\") pod \"console-operator-9d4b6777b-6z9wv\" (UID: \"2ae54e6c-a291-4b2e-8885-ba0d08f9048c\") " pod="openshift-console-operator/console-operator-9d4b6777b-6z9wv" Apr 24 22:32:15.156822 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.156732 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0595875-db43-466a-aa45-15f3138253c4-serving-cert\") pod \"insights-operator-585dfdc468-m7xn9\" (UID: \"e0595875-db43-466a-aa45-15f3138253c4\") " pod="openshift-insights/insights-operator-585dfdc468-m7xn9" Apr 24 22:32:15.156822 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.156774 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3150c082-9c90-4dd3-ad49-9ddb41172e83-installation-pull-secrets\") pod \"image-registry-c7ccbc59-779tb\" (UID: \"3150c082-9c90-4dd3-ad49-9ddb41172e83\") " pod="openshift-image-registry/image-registry-c7ccbc59-779tb" Apr 24 22:32:15.156822 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.156807 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2ae54e6c-a291-4b2e-8885-ba0d08f9048c-config\") pod \"console-operator-9d4b6777b-6z9wv\" (UID: \"2ae54e6c-a291-4b2e-8885-ba0d08f9048c\") " pod="openshift-console-operator/console-operator-9d4b6777b-6z9wv" Apr 24 22:32:15.156822 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.156823 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ae54e6c-a291-4b2e-8885-ba0d08f9048c-serving-cert\") pod \"console-operator-9d4b6777b-6z9wv\" (UID: \"2ae54e6c-a291-4b2e-8885-ba0d08f9048c\") " pod="openshift-console-operator/console-operator-9d4b6777b-6z9wv" Apr 24 22:32:15.157179 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.156841 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/e0595875-db43-466a-aa45-15f3138253c4-snapshots\") pod \"insights-operator-585dfdc468-m7xn9\" (UID: \"e0595875-db43-466a-aa45-15f3138253c4\") " pod="openshift-insights/insights-operator-585dfdc468-m7xn9" Apr 24 22:32:15.157179 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.156856 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7c99265f-97ef-4683-8d9c-f7c17dd3f1ec-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zq5vs\" (UID: \"7c99265f-97ef-4683-8d9c-f7c17dd3f1ec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zq5vs" Apr 24 22:32:15.157179 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.156880 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0595875-db43-466a-aa45-15f3138253c4-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-m7xn9\" (UID: \"e0595875-db43-466a-aa45-15f3138253c4\") " pod="openshift-insights/insights-operator-585dfdc468-m7xn9" Apr 24 
22:32:15.157179 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.156901 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bwvjz\" (UniqueName: \"kubernetes.io/projected/e0595875-db43-466a-aa45-15f3138253c4-kube-api-access-bwvjz\") pod \"insights-operator-585dfdc468-m7xn9\" (UID: \"e0595875-db43-466a-aa45-15f3138253c4\") " pod="openshift-insights/insights-operator-585dfdc468-m7xn9" Apr 24 22:32:15.157179 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.156918 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-th5ks\" (UniqueName: \"kubernetes.io/projected/2ae54e6c-a291-4b2e-8885-ba0d08f9048c-kube-api-access-th5ks\") pod \"console-operator-9d4b6777b-6z9wv\" (UID: \"2ae54e6c-a291-4b2e-8885-ba0d08f9048c\") " pod="openshift-console-operator/console-operator-9d4b6777b-6z9wv" Apr 24 22:32:15.157179 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.156944 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3150c082-9c90-4dd3-ad49-9ddb41172e83-bound-sa-token\") pod \"image-registry-c7ccbc59-779tb\" (UID: \"3150c082-9c90-4dd3-ad49-9ddb41172e83\") " pod="openshift-image-registry/image-registry-c7ccbc59-779tb" Apr 24 22:32:15.157179 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:32:15.157034 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 22:32:15.157179 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:32:15.157104 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c99265f-97ef-4683-8d9c-f7c17dd3f1ec-samples-operator-tls podName:7c99265f-97ef-4683-8d9c-f7c17dd3f1ec nodeName:}" failed. No retries permitted until 2026-04-24 22:32:15.657081699 +0000 UTC m=+145.380674253 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/7c99265f-97ef-4683-8d9c-f7c17dd3f1ec-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-zq5vs" (UID: "7c99265f-97ef-4683-8d9c-f7c17dd3f1ec") : secret "samples-operator-tls" not found Apr 24 22:32:15.157575 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.157035 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0595875-db43-466a-aa45-15f3138253c4-service-ca-bundle\") pod \"insights-operator-585dfdc468-m7xn9\" (UID: \"e0595875-db43-466a-aa45-15f3138253c4\") " pod="openshift-insights/insights-operator-585dfdc468-m7xn9" Apr 24 22:32:15.157575 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.157360 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3150c082-9c90-4dd3-ad49-9ddb41172e83-ca-trust-extracted\") pod \"image-registry-c7ccbc59-779tb\" (UID: \"3150c082-9c90-4dd3-ad49-9ddb41172e83\") " pod="openshift-image-registry/image-registry-c7ccbc59-779tb" Apr 24 22:32:15.157575 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.157397 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zr8t5\" (UniqueName: \"kubernetes.io/projected/7c99265f-97ef-4683-8d9c-f7c17dd3f1ec-kube-api-access-zr8t5\") pod \"cluster-samples-operator-6dc5bdb6b4-zq5vs\" (UID: \"7c99265f-97ef-4683-8d9c-f7c17dd3f1ec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zq5vs" Apr 24 22:32:15.157575 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.157437 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3150c082-9c90-4dd3-ad49-9ddb41172e83-registry-certificates\") pod 
\"image-registry-c7ccbc59-779tb\" (UID: \"3150c082-9c90-4dd3-ad49-9ddb41172e83\") " pod="openshift-image-registry/image-registry-c7ccbc59-779tb" Apr 24 22:32:15.157575 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.157461 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5z7p\" (UniqueName: \"kubernetes.io/projected/3150c082-9c90-4dd3-ad49-9ddb41172e83-kube-api-access-p5z7p\") pod \"image-registry-c7ccbc59-779tb\" (UID: \"3150c082-9c90-4dd3-ad49-9ddb41172e83\") " pod="openshift-image-registry/image-registry-c7ccbc59-779tb" Apr 24 22:32:15.157575 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.157489 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3150c082-9c90-4dd3-ad49-9ddb41172e83-registry-tls\") pod \"image-registry-c7ccbc59-779tb\" (UID: \"3150c082-9c90-4dd3-ad49-9ddb41172e83\") " pod="openshift-image-registry/image-registry-c7ccbc59-779tb" Apr 24 22:32:15.157575 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.157542 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e0595875-db43-466a-aa45-15f3138253c4-tmp\") pod \"insights-operator-585dfdc468-m7xn9\" (UID: \"e0595875-db43-466a-aa45-15f3138253c4\") " pod="openshift-insights/insights-operator-585dfdc468-m7xn9" Apr 24 22:32:15.157575 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.157564 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/e0595875-db43-466a-aa45-15f3138253c4-snapshots\") pod \"insights-operator-585dfdc468-m7xn9\" (UID: \"e0595875-db43-466a-aa45-15f3138253c4\") " pod="openshift-insights/insights-operator-585dfdc468-m7xn9" Apr 24 22:32:15.157575 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.157573 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ae54e6c-a291-4b2e-8885-ba0d08f9048c-trusted-ca\") pod \"console-operator-9d4b6777b-6z9wv\" (UID: \"2ae54e6c-a291-4b2e-8885-ba0d08f9048c\") " pod="openshift-console-operator/console-operator-9d4b6777b-6z9wv" Apr 24 22:32:15.158046 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.157603 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3150c082-9c90-4dd3-ad49-9ddb41172e83-trusted-ca\") pod \"image-registry-c7ccbc59-779tb\" (UID: \"3150c082-9c90-4dd3-ad49-9ddb41172e83\") " pod="openshift-image-registry/image-registry-c7ccbc59-779tb" Apr 24 22:32:15.158046 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.157635 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0595875-db43-466a-aa45-15f3138253c4-service-ca-bundle\") pod \"insights-operator-585dfdc468-m7xn9\" (UID: \"e0595875-db43-466a-aa45-15f3138253c4\") " pod="openshift-insights/insights-operator-585dfdc468-m7xn9" Apr 24 22:32:15.158046 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.157646 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3150c082-9c90-4dd3-ad49-9ddb41172e83-image-registry-private-configuration\") pod \"image-registry-c7ccbc59-779tb\" (UID: \"3150c082-9c90-4dd3-ad49-9ddb41172e83\") " pod="openshift-image-registry/image-registry-c7ccbc59-779tb" Apr 24 22:32:15.158046 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.157683 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ae54e6c-a291-4b2e-8885-ba0d08f9048c-config\") pod \"console-operator-9d4b6777b-6z9wv\" (UID: 
\"2ae54e6c-a291-4b2e-8885-ba0d08f9048c\") " pod="openshift-console-operator/console-operator-9d4b6777b-6z9wv" Apr 24 22:32:15.158046 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.157781 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e0595875-db43-466a-aa45-15f3138253c4-tmp\") pod \"insights-operator-585dfdc468-m7xn9\" (UID: \"e0595875-db43-466a-aa45-15f3138253c4\") " pod="openshift-insights/insights-operator-585dfdc468-m7xn9" Apr 24 22:32:15.158427 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.158405 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ae54e6c-a291-4b2e-8885-ba0d08f9048c-trusted-ca\") pod \"console-operator-9d4b6777b-6z9wv\" (UID: \"2ae54e6c-a291-4b2e-8885-ba0d08f9048c\") " pod="openshift-console-operator/console-operator-9d4b6777b-6z9wv" Apr 24 22:32:15.158830 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.158810 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0595875-db43-466a-aa45-15f3138253c4-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-m7xn9\" (UID: \"e0595875-db43-466a-aa45-15f3138253c4\") " pod="openshift-insights/insights-operator-585dfdc468-m7xn9" Apr 24 22:32:15.160067 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.160049 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ae54e6c-a291-4b2e-8885-ba0d08f9048c-serving-cert\") pod \"console-operator-9d4b6777b-6z9wv\" (UID: \"2ae54e6c-a291-4b2e-8885-ba0d08f9048c\") " pod="openshift-console-operator/console-operator-9d4b6777b-6z9wv" Apr 24 22:32:15.160130 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.160055 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e0595875-db43-466a-aa45-15f3138253c4-serving-cert\") pod \"insights-operator-585dfdc468-m7xn9\" (UID: \"e0595875-db43-466a-aa45-15f3138253c4\") " pod="openshift-insights/insights-operator-585dfdc468-m7xn9" Apr 24 22:32:15.165487 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.165467 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-th5ks\" (UniqueName: \"kubernetes.io/projected/2ae54e6c-a291-4b2e-8885-ba0d08f9048c-kube-api-access-th5ks\") pod \"console-operator-9d4b6777b-6z9wv\" (UID: \"2ae54e6c-a291-4b2e-8885-ba0d08f9048c\") " pod="openshift-console-operator/console-operator-9d4b6777b-6z9wv" Apr 24 22:32:15.166196 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.166178 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr8t5\" (UniqueName: \"kubernetes.io/projected/7c99265f-97ef-4683-8d9c-f7c17dd3f1ec-kube-api-access-zr8t5\") pod \"cluster-samples-operator-6dc5bdb6b4-zq5vs\" (UID: \"7c99265f-97ef-4683-8d9c-f7c17dd3f1ec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zq5vs" Apr 24 22:32:15.166546 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.166527 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwvjz\" (UniqueName: \"kubernetes.io/projected/e0595875-db43-466a-aa45-15f3138253c4-kube-api-access-bwvjz\") pod \"insights-operator-585dfdc468-m7xn9\" (UID: \"e0595875-db43-466a-aa45-15f3138253c4\") " pod="openshift-insights/insights-operator-585dfdc468-m7xn9" Apr 24 22:32:15.207696 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.207662 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-m7xn9" Apr 24 22:32:15.213489 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.213460 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-6z9wv" Apr 24 22:32:15.258987 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.258931 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3150c082-9c90-4dd3-ad49-9ddb41172e83-bound-sa-token\") pod \"image-registry-c7ccbc59-779tb\" (UID: \"3150c082-9c90-4dd3-ad49-9ddb41172e83\") " pod="openshift-image-registry/image-registry-c7ccbc59-779tb" Apr 24 22:32:15.259121 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.259013 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3150c082-9c90-4dd3-ad49-9ddb41172e83-ca-trust-extracted\") pod \"image-registry-c7ccbc59-779tb\" (UID: \"3150c082-9c90-4dd3-ad49-9ddb41172e83\") " pod="openshift-image-registry/image-registry-c7ccbc59-779tb" Apr 24 22:32:15.259121 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.259041 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3150c082-9c90-4dd3-ad49-9ddb41172e83-registry-certificates\") pod \"image-registry-c7ccbc59-779tb\" (UID: \"3150c082-9c90-4dd3-ad49-9ddb41172e83\") " pod="openshift-image-registry/image-registry-c7ccbc59-779tb" Apr 24 22:32:15.259121 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.259064 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5z7p\" (UniqueName: \"kubernetes.io/projected/3150c082-9c90-4dd3-ad49-9ddb41172e83-kube-api-access-p5z7p\") pod \"image-registry-c7ccbc59-779tb\" (UID: \"3150c082-9c90-4dd3-ad49-9ddb41172e83\") " pod="openshift-image-registry/image-registry-c7ccbc59-779tb" Apr 24 22:32:15.259121 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.259090 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"registry-tls\" (UniqueName: \"kubernetes.io/projected/3150c082-9c90-4dd3-ad49-9ddb41172e83-registry-tls\") pod \"image-registry-c7ccbc59-779tb\" (UID: \"3150c082-9c90-4dd3-ad49-9ddb41172e83\") " pod="openshift-image-registry/image-registry-c7ccbc59-779tb" Apr 24 22:32:15.259345 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.259127 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3150c082-9c90-4dd3-ad49-9ddb41172e83-trusted-ca\") pod \"image-registry-c7ccbc59-779tb\" (UID: \"3150c082-9c90-4dd3-ad49-9ddb41172e83\") " pod="openshift-image-registry/image-registry-c7ccbc59-779tb" Apr 24 22:32:15.259345 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.259164 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3150c082-9c90-4dd3-ad49-9ddb41172e83-image-registry-private-configuration\") pod \"image-registry-c7ccbc59-779tb\" (UID: \"3150c082-9c90-4dd3-ad49-9ddb41172e83\") " pod="openshift-image-registry/image-registry-c7ccbc59-779tb" Apr 24 22:32:15.259345 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.259222 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3150c082-9c90-4dd3-ad49-9ddb41172e83-installation-pull-secrets\") pod \"image-registry-c7ccbc59-779tb\" (UID: \"3150c082-9c90-4dd3-ad49-9ddb41172e83\") " pod="openshift-image-registry/image-registry-c7ccbc59-779tb" Apr 24 22:32:15.261072 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:32:15.260216 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 22:32:15.261072 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:32:15.260246 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-c7ccbc59-779tb: secret "image-registry-tls" not found Apr 24 22:32:15.261072 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:32:15.260323 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3150c082-9c90-4dd3-ad49-9ddb41172e83-registry-tls podName:3150c082-9c90-4dd3-ad49-9ddb41172e83 nodeName:}" failed. No retries permitted until 2026-04-24 22:32:15.760303107 +0000 UTC m=+145.483895673 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3150c082-9c90-4dd3-ad49-9ddb41172e83-registry-tls") pod "image-registry-c7ccbc59-779tb" (UID: "3150c082-9c90-4dd3-ad49-9ddb41172e83") : secret "image-registry-tls" not found Apr 24 22:32:15.261072 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.260834 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3150c082-9c90-4dd3-ad49-9ddb41172e83-registry-certificates\") pod \"image-registry-c7ccbc59-779tb\" (UID: \"3150c082-9c90-4dd3-ad49-9ddb41172e83\") " pod="openshift-image-registry/image-registry-c7ccbc59-779tb" Apr 24 22:32:15.261331 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.261250 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3150c082-9c90-4dd3-ad49-9ddb41172e83-ca-trust-extracted\") pod \"image-registry-c7ccbc59-779tb\" (UID: \"3150c082-9c90-4dd3-ad49-9ddb41172e83\") " pod="openshift-image-registry/image-registry-c7ccbc59-779tb" Apr 24 22:32:15.262219 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.262112 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3150c082-9c90-4dd3-ad49-9ddb41172e83-trusted-ca\") pod \"image-registry-c7ccbc59-779tb\" (UID: \"3150c082-9c90-4dd3-ad49-9ddb41172e83\") " 
pod="openshift-image-registry/image-registry-c7ccbc59-779tb" Apr 24 22:32:15.268706 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.268384 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3150c082-9c90-4dd3-ad49-9ddb41172e83-installation-pull-secrets\") pod \"image-registry-c7ccbc59-779tb\" (UID: \"3150c082-9c90-4dd3-ad49-9ddb41172e83\") " pod="openshift-image-registry/image-registry-c7ccbc59-779tb" Apr 24 22:32:15.272194 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.272134 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3150c082-9c90-4dd3-ad49-9ddb41172e83-bound-sa-token\") pod \"image-registry-c7ccbc59-779tb\" (UID: \"3150c082-9c90-4dd3-ad49-9ddb41172e83\") " pod="openshift-image-registry/image-registry-c7ccbc59-779tb" Apr 24 22:32:15.274995 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.273926 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3150c082-9c90-4dd3-ad49-9ddb41172e83-image-registry-private-configuration\") pod \"image-registry-c7ccbc59-779tb\" (UID: \"3150c082-9c90-4dd3-ad49-9ddb41172e83\") " pod="openshift-image-registry/image-registry-c7ccbc59-779tb" Apr 24 22:32:15.276831 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.276786 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5z7p\" (UniqueName: \"kubernetes.io/projected/3150c082-9c90-4dd3-ad49-9ddb41172e83-kube-api-access-p5z7p\") pod \"image-registry-c7ccbc59-779tb\" (UID: \"3150c082-9c90-4dd3-ad49-9ddb41172e83\") " pod="openshift-image-registry/image-registry-c7ccbc59-779tb" Apr 24 22:32:15.332799 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.332760 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-insights/insights-operator-585dfdc468-m7xn9"] Apr 24 22:32:15.335648 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:32:15.335617 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0595875_db43_466a_aa45_15f3138253c4.slice/crio-0a442ca164e8a7518e873dc97a423e7622f14b07a22048cd06858da5fafa146c WatchSource:0}: Error finding container 0a442ca164e8a7518e873dc97a423e7622f14b07a22048cd06858da5fafa146c: Status 404 returned error can't find the container with id 0a442ca164e8a7518e873dc97a423e7622f14b07a22048cd06858da5fafa146c Apr 24 22:32:15.348080 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.348058 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-6z9wv"] Apr 24 22:32:15.351391 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:32:15.351366 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ae54e6c_a291_4b2e_8885_ba0d08f9048c.slice/crio-deaf66ae7a6729b30cbbda1134520ac83f6592cb83db3b635aaa3f09a6320059 WatchSource:0}: Error finding container deaf66ae7a6729b30cbbda1134520ac83f6592cb83db3b635aaa3f09a6320059: Status 404 returned error can't find the container with id deaf66ae7a6729b30cbbda1134520ac83f6592cb83db3b635aaa3f09a6320059 Apr 24 22:32:15.662263 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.662229 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7c99265f-97ef-4683-8d9c-f7c17dd3f1ec-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zq5vs\" (UID: \"7c99265f-97ef-4683-8d9c-f7c17dd3f1ec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zq5vs" Apr 24 22:32:15.662455 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:32:15.662340 2574 secret.go:189] Couldn't get secret 
openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 22:32:15.662455 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:32:15.662396 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c99265f-97ef-4683-8d9c-f7c17dd3f1ec-samples-operator-tls podName:7c99265f-97ef-4683-8d9c-f7c17dd3f1ec nodeName:}" failed. No retries permitted until 2026-04-24 22:32:16.662380816 +0000 UTC m=+146.385973365 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/7c99265f-97ef-4683-8d9c-f7c17dd3f1ec-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-zq5vs" (UID: "7c99265f-97ef-4683-8d9c-f7c17dd3f1ec") : secret "samples-operator-tls" not found Apr 24 22:32:15.763058 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:15.763005 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3150c082-9c90-4dd3-ad49-9ddb41172e83-registry-tls\") pod \"image-registry-c7ccbc59-779tb\" (UID: \"3150c082-9c90-4dd3-ad49-9ddb41172e83\") " pod="openshift-image-registry/image-registry-c7ccbc59-779tb" Apr 24 22:32:15.763246 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:32:15.763140 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 22:32:15.763246 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:32:15.763162 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c7ccbc59-779tb: secret "image-registry-tls" not found Apr 24 22:32:15.763246 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:32:15.763215 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3150c082-9c90-4dd3-ad49-9ddb41172e83-registry-tls podName:3150c082-9c90-4dd3-ad49-9ddb41172e83 nodeName:}" failed. 
No retries permitted until 2026-04-24 22:32:16.763201374 +0000 UTC m=+146.486793922 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3150c082-9c90-4dd3-ad49-9ddb41172e83-registry-tls") pod "image-registry-c7ccbc59-779tb" (UID: "3150c082-9c90-4dd3-ad49-9ddb41172e83") : secret "image-registry-tls" not found Apr 24 22:32:16.224303 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:16.224267 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-m7xn9" event={"ID":"e0595875-db43-466a-aa45-15f3138253c4","Type":"ContainerStarted","Data":"0a442ca164e8a7518e873dc97a423e7622f14b07a22048cd06858da5fafa146c"} Apr 24 22:32:16.225415 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:16.225383 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-6z9wv" event={"ID":"2ae54e6c-a291-4b2e-8885-ba0d08f9048c","Type":"ContainerStarted","Data":"deaf66ae7a6729b30cbbda1134520ac83f6592cb83db3b635aaa3f09a6320059"} Apr 24 22:32:16.671090 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:16.671051 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7c99265f-97ef-4683-8d9c-f7c17dd3f1ec-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zq5vs\" (UID: \"7c99265f-97ef-4683-8d9c-f7c17dd3f1ec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zq5vs" Apr 24 22:32:16.671262 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:32:16.671215 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 22:32:16.671326 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:32:16.671290 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c99265f-97ef-4683-8d9c-f7c17dd3f1ec-samples-operator-tls 
podName:7c99265f-97ef-4683-8d9c-f7c17dd3f1ec nodeName:}" failed. No retries permitted until 2026-04-24 22:32:18.671274183 +0000 UTC m=+148.394866735 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/7c99265f-97ef-4683-8d9c-f7c17dd3f1ec-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-zq5vs" (UID: "7c99265f-97ef-4683-8d9c-f7c17dd3f1ec") : secret "samples-operator-tls" not found Apr 24 22:32:16.772014 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:16.771971 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3150c082-9c90-4dd3-ad49-9ddb41172e83-registry-tls\") pod \"image-registry-c7ccbc59-779tb\" (UID: \"3150c082-9c90-4dd3-ad49-9ddb41172e83\") " pod="openshift-image-registry/image-registry-c7ccbc59-779tb" Apr 24 22:32:16.772196 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:32:16.772127 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 22:32:16.772196 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:32:16.772154 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c7ccbc59-779tb: secret "image-registry-tls" not found Apr 24 22:32:16.772283 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:32:16.772219 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3150c082-9c90-4dd3-ad49-9ddb41172e83-registry-tls podName:3150c082-9c90-4dd3-ad49-9ddb41172e83 nodeName:}" failed. No retries permitted until 2026-04-24 22:32:18.772199323 +0000 UTC m=+148.495791888 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3150c082-9c90-4dd3-ad49-9ddb41172e83-registry-tls") pod "image-registry-c7ccbc59-779tb" (UID: "3150c082-9c90-4dd3-ad49-9ddb41172e83") : secret "image-registry-tls" not found Apr 24 22:32:18.230547 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:18.230518 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6z9wv_2ae54e6c-a291-4b2e-8885-ba0d08f9048c/console-operator/0.log" Apr 24 22:32:18.230996 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:18.230560 2574 generic.go:358] "Generic (PLEG): container finished" podID="2ae54e6c-a291-4b2e-8885-ba0d08f9048c" containerID="d90aa10f4504857b05133d7a3b77399e5213b4215de0a3fc8e438d9e04c1e6f5" exitCode=255 Apr 24 22:32:18.230996 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:18.230644 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-6z9wv" event={"ID":"2ae54e6c-a291-4b2e-8885-ba0d08f9048c","Type":"ContainerDied","Data":"d90aa10f4504857b05133d7a3b77399e5213b4215de0a3fc8e438d9e04c1e6f5"} Apr 24 22:32:18.230996 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:18.230884 2574 scope.go:117] "RemoveContainer" containerID="d90aa10f4504857b05133d7a3b77399e5213b4215de0a3fc8e438d9e04c1e6f5" Apr 24 22:32:18.231874 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:18.231854 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-m7xn9" event={"ID":"e0595875-db43-466a-aa45-15f3138253c4","Type":"ContainerStarted","Data":"22571ce282aef417bb42d60157e895c2d6ed6345e3829e28da29299f891e39ab"} Apr 24 22:32:18.262428 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:18.262374 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-m7xn9" podStartSLOduration=1.966909732 podStartE2EDuration="4.262357715s" 
podCreationTimestamp="2026-04-24 22:32:14 +0000 UTC" firstStartedPulling="2026-04-24 22:32:15.337571425 +0000 UTC m=+145.061163973" lastFinishedPulling="2026-04-24 22:32:17.633019408 +0000 UTC m=+147.356611956" observedRunningTime="2026-04-24 22:32:18.262052932 +0000 UTC m=+147.985645503" watchObservedRunningTime="2026-04-24 22:32:18.262357715 +0000 UTC m=+147.985950288" Apr 24 22:32:18.689554 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:18.689517 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7c99265f-97ef-4683-8d9c-f7c17dd3f1ec-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zq5vs\" (UID: \"7c99265f-97ef-4683-8d9c-f7c17dd3f1ec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zq5vs" Apr 24 22:32:18.689732 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:32:18.689624 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 22:32:18.689732 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:32:18.689685 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c99265f-97ef-4683-8d9c-f7c17dd3f1ec-samples-operator-tls podName:7c99265f-97ef-4683-8d9c-f7c17dd3f1ec nodeName:}" failed. No retries permitted until 2026-04-24 22:32:22.689670729 +0000 UTC m=+152.413263278 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/7c99265f-97ef-4683-8d9c-f7c17dd3f1ec-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-zq5vs" (UID: "7c99265f-97ef-4683-8d9c-f7c17dd3f1ec") : secret "samples-operator-tls" not found Apr 24 22:32:18.791045 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:18.790994 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3150c082-9c90-4dd3-ad49-9ddb41172e83-registry-tls\") pod \"image-registry-c7ccbc59-779tb\" (UID: \"3150c082-9c90-4dd3-ad49-9ddb41172e83\") " pod="openshift-image-registry/image-registry-c7ccbc59-779tb" Apr 24 22:32:18.791223 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:32:18.791107 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 22:32:18.791223 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:32:18.791126 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c7ccbc59-779tb: secret "image-registry-tls" not found Apr 24 22:32:18.791223 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:32:18.791182 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3150c082-9c90-4dd3-ad49-9ddb41172e83-registry-tls podName:3150c082-9c90-4dd3-ad49-9ddb41172e83 nodeName:}" failed. No retries permitted until 2026-04-24 22:32:22.791165075 +0000 UTC m=+152.514757640 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3150c082-9c90-4dd3-ad49-9ddb41172e83-registry-tls") pod "image-registry-c7ccbc59-779tb" (UID: "3150c082-9c90-4dd3-ad49-9ddb41172e83") : secret "image-registry-tls" not found Apr 24 22:32:19.104174 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:19.104144 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-scg92"] Apr 24 22:32:19.108043 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:19.108020 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-scg92" Apr 24 22:32:19.110421 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:19.110398 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 24 22:32:19.110538 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:19.110510 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-g5jpw\"" Apr 24 22:32:19.111490 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:19.111475 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 24 22:32:19.115145 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:19.115127 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-scg92"] Apr 24 22:32:19.193659 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:19.193623 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4wht\" (UniqueName: \"kubernetes.io/projected/63b4d91d-1df3-49ed-9d57-7a0ec28ca165-kube-api-access-w4wht\") pod \"migrator-74bb7799d9-scg92\" (UID: 
\"63b4d91d-1df3-49ed-9d57-7a0ec28ca165\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-scg92" Apr 24 22:32:19.234805 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:19.234781 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6z9wv_2ae54e6c-a291-4b2e-8885-ba0d08f9048c/console-operator/1.log" Apr 24 22:32:19.235155 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:19.235137 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6z9wv_2ae54e6c-a291-4b2e-8885-ba0d08f9048c/console-operator/0.log" Apr 24 22:32:19.235203 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:19.235166 2574 generic.go:358] "Generic (PLEG): container finished" podID="2ae54e6c-a291-4b2e-8885-ba0d08f9048c" containerID="fa05f8e7bb74b057ecea2a6ebaad75861ee06c9d0c235d89bc7b0e505847453f" exitCode=255 Apr 24 22:32:19.235283 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:19.235262 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-6z9wv" event={"ID":"2ae54e6c-a291-4b2e-8885-ba0d08f9048c","Type":"ContainerDied","Data":"fa05f8e7bb74b057ecea2a6ebaad75861ee06c9d0c235d89bc7b0e505847453f"} Apr 24 22:32:19.235338 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:19.235311 2574 scope.go:117] "RemoveContainer" containerID="d90aa10f4504857b05133d7a3b77399e5213b4215de0a3fc8e438d9e04c1e6f5" Apr 24 22:32:19.235501 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:19.235482 2574 scope.go:117] "RemoveContainer" containerID="fa05f8e7bb74b057ecea2a6ebaad75861ee06c9d0c235d89bc7b0e505847453f" Apr 24 22:32:19.235737 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:32:19.235714 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator 
pod=console-operator-9d4b6777b-6z9wv_openshift-console-operator(2ae54e6c-a291-4b2e-8885-ba0d08f9048c)\"" pod="openshift-console-operator/console-operator-9d4b6777b-6z9wv" podUID="2ae54e6c-a291-4b2e-8885-ba0d08f9048c" Apr 24 22:32:19.294820 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:19.294794 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w4wht\" (UniqueName: \"kubernetes.io/projected/63b4d91d-1df3-49ed-9d57-7a0ec28ca165-kube-api-access-w4wht\") pod \"migrator-74bb7799d9-scg92\" (UID: \"63b4d91d-1df3-49ed-9d57-7a0ec28ca165\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-scg92" Apr 24 22:32:19.303999 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:19.303955 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4wht\" (UniqueName: \"kubernetes.io/projected/63b4d91d-1df3-49ed-9d57-7a0ec28ca165-kube-api-access-w4wht\") pod \"migrator-74bb7799d9-scg92\" (UID: \"63b4d91d-1df3-49ed-9d57-7a0ec28ca165\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-scg92" Apr 24 22:32:19.417276 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:19.417206 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-scg92" Apr 24 22:32:19.529645 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:19.529610 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-scg92"] Apr 24 22:32:19.533143 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:32:19.533114 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63b4d91d_1df3_49ed_9d57_7a0ec28ca165.slice/crio-af07878cbce2536fccd46cf69d7fd9b0196ead686425bf825b29df581ee5cc59 WatchSource:0}: Error finding container af07878cbce2536fccd46cf69d7fd9b0196ead686425bf825b29df581ee5cc59: Status 404 returned error can't find the container with id af07878cbce2536fccd46cf69d7fd9b0196ead686425bf825b29df581ee5cc59 Apr 24 22:32:20.238520 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:20.238495 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6z9wv_2ae54e6c-a291-4b2e-8885-ba0d08f9048c/console-operator/1.log" Apr 24 22:32:20.239098 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:20.238900 2574 scope.go:117] "RemoveContainer" containerID="fa05f8e7bb74b057ecea2a6ebaad75861ee06c9d0c235d89bc7b0e505847453f" Apr 24 22:32:20.239172 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:32:20.239154 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-6z9wv_openshift-console-operator(2ae54e6c-a291-4b2e-8885-ba0d08f9048c)\"" pod="openshift-console-operator/console-operator-9d4b6777b-6z9wv" podUID="2ae54e6c-a291-4b2e-8885-ba0d08f9048c" Apr 24 22:32:20.239772 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:20.239749 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-scg92" event={"ID":"63b4d91d-1df3-49ed-9d57-7a0ec28ca165","Type":"ContainerStarted","Data":"af07878cbce2536fccd46cf69d7fd9b0196ead686425bf825b29df581ee5cc59"} Apr 24 22:32:21.245723 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:21.245685 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-scg92" event={"ID":"63b4d91d-1df3-49ed-9d57-7a0ec28ca165","Type":"ContainerStarted","Data":"c6eca4c8f6ced832f3c740017bf5da2c56e421f8a36e51c66130f0a6d0a964b7"} Apr 24 22:32:21.245723 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:21.245725 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-scg92" event={"ID":"63b4d91d-1df3-49ed-9d57-7a0ec28ca165","Type":"ContainerStarted","Data":"3909e4d6a8ff127c05d50c3da0de56638a89f16e3a1cf7bd747c632850b649af"} Apr 24 22:32:21.262242 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:21.262199 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-scg92" podStartSLOduration=1.174155162 podStartE2EDuration="2.262185755s" podCreationTimestamp="2026-04-24 22:32:19 +0000 UTC" firstStartedPulling="2026-04-24 22:32:19.534885473 +0000 UTC m=+149.258478020" lastFinishedPulling="2026-04-24 22:32:20.622916047 +0000 UTC m=+150.346508613" observedRunningTime="2026-04-24 22:32:21.261938843 +0000 UTC m=+150.985531413" watchObservedRunningTime="2026-04-24 22:32:21.262185755 +0000 UTC m=+150.985778322" Apr 24 22:32:22.159132 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:22.159103 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-knznw_8acc4f4f-831f-4c10-a187-01230734276e/dns-node-resolver/0.log" Apr 24 22:32:22.720432 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:22.720399 2574 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7c99265f-97ef-4683-8d9c-f7c17dd3f1ec-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zq5vs\" (UID: \"7c99265f-97ef-4683-8d9c-f7c17dd3f1ec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zq5vs" Apr 24 22:32:22.720789 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:32:22.720513 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 22:32:22.720789 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:32:22.720575 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c99265f-97ef-4683-8d9c-f7c17dd3f1ec-samples-operator-tls podName:7c99265f-97ef-4683-8d9c-f7c17dd3f1ec nodeName:}" failed. No retries permitted until 2026-04-24 22:32:30.720560557 +0000 UTC m=+160.444153105 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/7c99265f-97ef-4683-8d9c-f7c17dd3f1ec-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-zq5vs" (UID: "7c99265f-97ef-4683-8d9c-f7c17dd3f1ec") : secret "samples-operator-tls" not found Apr 24 22:32:22.821294 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:22.821259 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3150c082-9c90-4dd3-ad49-9ddb41172e83-registry-tls\") pod \"image-registry-c7ccbc59-779tb\" (UID: \"3150c082-9c90-4dd3-ad49-9ddb41172e83\") " pod="openshift-image-registry/image-registry-c7ccbc59-779tb" Apr 24 22:32:22.821449 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:32:22.821404 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 22:32:22.821449 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:32:22.821424 2574 projected.go:194] Error preparing data for 
projected volume registry-tls for pod openshift-image-registry/image-registry-c7ccbc59-779tb: secret "image-registry-tls" not found Apr 24 22:32:22.821536 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:32:22.821487 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3150c082-9c90-4dd3-ad49-9ddb41172e83-registry-tls podName:3150c082-9c90-4dd3-ad49-9ddb41172e83 nodeName:}" failed. No retries permitted until 2026-04-24 22:32:30.821471552 +0000 UTC m=+160.545064100 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3150c082-9c90-4dd3-ad49-9ddb41172e83-registry-tls") pod "image-registry-c7ccbc59-779tb" (UID: "3150c082-9c90-4dd3-ad49-9ddb41172e83") : secret "image-registry-tls" not found Apr 24 22:32:23.358561 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:23.358535 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xx2md_939c10b8-9c56-4502-a65a-30206c40fa9d/node-ca/0.log" Apr 24 22:32:24.368767 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:24.368741 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-scg92_63b4d91d-1df3-49ed-9d57-7a0ec28ca165/migrator/0.log" Apr 24 22:32:24.560666 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:24.560638 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-scg92_63b4d91d-1df3-49ed-9d57-7a0ec28ca165/graceful-termination/0.log" Apr 24 22:32:25.214461 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:25.214422 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-6z9wv" Apr 24 22:32:25.214461 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:25.214467 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-console-operator/console-operator-9d4b6777b-6z9wv" Apr 24 22:32:25.214885 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:25.214866 2574 scope.go:117] "RemoveContainer" containerID="fa05f8e7bb74b057ecea2a6ebaad75861ee06c9d0c235d89bc7b0e505847453f" Apr 24 22:32:25.215109 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:32:25.215089 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-6z9wv_openshift-console-operator(2ae54e6c-a291-4b2e-8885-ba0d08f9048c)\"" pod="openshift-console-operator/console-operator-9d4b6777b-6z9wv" podUID="2ae54e6c-a291-4b2e-8885-ba0d08f9048c" Apr 24 22:32:26.722950 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:32:26.722904 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-gt62l" podUID="8a25583f-2bb0-4f33-93bd-3e30aec48cee" Apr 24 22:32:26.739457 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:32:26.739419 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-vffmw" podUID="8eff2e0b-1c86-4882-8b55-3fb02eb38ebe" Apr 24 22:32:26.865741 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:32:26.865706 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-8ztg8" podUID="d0c4cf71-fe26-4a65-bc22-b98bb5827d73" Apr 24 22:32:27.259687 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:27.259659 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-gt62l" Apr 24 22:32:30.787860 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:30.787817 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7c99265f-97ef-4683-8d9c-f7c17dd3f1ec-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zq5vs\" (UID: \"7c99265f-97ef-4683-8d9c-f7c17dd3f1ec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zq5vs" Apr 24 22:32:30.790116 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:30.790085 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7c99265f-97ef-4683-8d9c-f7c17dd3f1ec-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zq5vs\" (UID: \"7c99265f-97ef-4683-8d9c-f7c17dd3f1ec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zq5vs" Apr 24 22:32:30.818142 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:30.818111 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zq5vs" Apr 24 22:32:30.889488 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:30.889220 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3150c082-9c90-4dd3-ad49-9ddb41172e83-registry-tls\") pod \"image-registry-c7ccbc59-779tb\" (UID: \"3150c082-9c90-4dd3-ad49-9ddb41172e83\") " pod="openshift-image-registry/image-registry-c7ccbc59-779tb" Apr 24 22:32:30.892859 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:30.892830 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3150c082-9c90-4dd3-ad49-9ddb41172e83-registry-tls\") pod \"image-registry-c7ccbc59-779tb\" (UID: \"3150c082-9c90-4dd3-ad49-9ddb41172e83\") " pod="openshift-image-registry/image-registry-c7ccbc59-779tb" Apr 24 22:32:30.900733 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:30.900706 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-c7ccbc59-779tb" Apr 24 22:32:30.934913 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:30.934880 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zq5vs"] Apr 24 22:32:31.023785 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:31.023752 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-c7ccbc59-779tb"] Apr 24 22:32:31.027443 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:32:31.027413 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3150c082_9c90_4dd3_ad49_9ddb41172e83.slice/crio-1af15764f4c8eab78b56920c237b7cf390e2e27725a826be3148b8fdd1247efa WatchSource:0}: Error finding container 1af15764f4c8eab78b56920c237b7cf390e2e27725a826be3148b8fdd1247efa: Status 404 returned error can't find the container with id 1af15764f4c8eab78b56920c237b7cf390e2e27725a826be3148b8fdd1247efa Apr 24 22:32:31.268607 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:31.268563 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zq5vs" event={"ID":"7c99265f-97ef-4683-8d9c-f7c17dd3f1ec","Type":"ContainerStarted","Data":"edc2222efcccf7294ced08cd7c8e4702a757e2fb0a1417a911017463dd56cd3c"} Apr 24 22:32:31.269847 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:31.269811 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-c7ccbc59-779tb" event={"ID":"3150c082-9c90-4dd3-ad49-9ddb41172e83","Type":"ContainerStarted","Data":"ed190ceb164c74eca1e4c18edb980c1d206daef1cddd80771972cc11d2fcdef3"} Apr 24 22:32:31.269847 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:31.269845 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-c7ccbc59-779tb" 
event={"ID":"3150c082-9c90-4dd3-ad49-9ddb41172e83","Type":"ContainerStarted","Data":"1af15764f4c8eab78b56920c237b7cf390e2e27725a826be3148b8fdd1247efa"} Apr 24 22:32:31.270003 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:31.269988 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-c7ccbc59-779tb" Apr 24 22:32:31.296937 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:31.296830 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-c7ccbc59-779tb" podStartSLOduration=17.296812484 podStartE2EDuration="17.296812484s" podCreationTimestamp="2026-04-24 22:32:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:32:31.295768702 +0000 UTC m=+161.019361273" watchObservedRunningTime="2026-04-24 22:32:31.296812484 +0000 UTC m=+161.020405056" Apr 24 22:32:31.596165 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:31.596079 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8eff2e0b-1c86-4882-8b55-3fb02eb38ebe-cert\") pod \"ingress-canary-vffmw\" (UID: \"8eff2e0b-1c86-4882-8b55-3fb02eb38ebe\") " pod="openshift-ingress-canary/ingress-canary-vffmw" Apr 24 22:32:31.596165 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:31.596163 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8a25583f-2bb0-4f33-93bd-3e30aec48cee-metrics-tls\") pod \"dns-default-gt62l\" (UID: \"8a25583f-2bb0-4f33-93bd-3e30aec48cee\") " pod="openshift-dns/dns-default-gt62l" Apr 24 22:32:31.598915 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:31.598869 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/8a25583f-2bb0-4f33-93bd-3e30aec48cee-metrics-tls\") pod \"dns-default-gt62l\" (UID: \"8a25583f-2bb0-4f33-93bd-3e30aec48cee\") " pod="openshift-dns/dns-default-gt62l" Apr 24 22:32:31.599098 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:31.599076 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8eff2e0b-1c86-4882-8b55-3fb02eb38ebe-cert\") pod \"ingress-canary-vffmw\" (UID: \"8eff2e0b-1c86-4882-8b55-3fb02eb38ebe\") " pod="openshift-ingress-canary/ingress-canary-vffmw" Apr 24 22:32:31.763071 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:31.763038 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-jqx86\"" Apr 24 22:32:31.771219 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:31.771191 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-gt62l" Apr 24 22:32:31.911238 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:31.911205 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gt62l"] Apr 24 22:32:31.915410 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:32:31.915383 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a25583f_2bb0_4f33_93bd_3e30aec48cee.slice/crio-d1ea4e08ef8f682992a611a7a7d2671dadf6545d3e56f64ce38cd6ec1cbd8025 WatchSource:0}: Error finding container d1ea4e08ef8f682992a611a7a7d2671dadf6545d3e56f64ce38cd6ec1cbd8025: Status 404 returned error can't find the container with id d1ea4e08ef8f682992a611a7a7d2671dadf6545d3e56f64ce38cd6ec1cbd8025 Apr 24 22:32:32.273407 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:32.273318 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gt62l" 
event={"ID":"8a25583f-2bb0-4f33-93bd-3e30aec48cee","Type":"ContainerStarted","Data":"d1ea4e08ef8f682992a611a7a7d2671dadf6545d3e56f64ce38cd6ec1cbd8025"} Apr 24 22:32:33.277955 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:33.277913 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zq5vs" event={"ID":"7c99265f-97ef-4683-8d9c-f7c17dd3f1ec","Type":"ContainerStarted","Data":"40bfbab598f800e56604e783bd08a03f231b0f43a9785463d46ca27b6513456c"} Apr 24 22:32:33.277955 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:33.277968 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zq5vs" event={"ID":"7c99265f-97ef-4683-8d9c-f7c17dd3f1ec","Type":"ContainerStarted","Data":"c463d0d748caf3146a7f982d0b4f1e1c80cb067d8007e382abe984410788fcc5"} Apr 24 22:32:33.294918 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:33.294858 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zq5vs" podStartSLOduration=17.672311639 podStartE2EDuration="19.294841972s" podCreationTimestamp="2026-04-24 22:32:14 +0000 UTC" firstStartedPulling="2026-04-24 22:32:30.985777092 +0000 UTC m=+160.709369641" lastFinishedPulling="2026-04-24 22:32:32.608307426 +0000 UTC m=+162.331899974" observedRunningTime="2026-04-24 22:32:33.294074441 +0000 UTC m=+163.017667013" watchObservedRunningTime="2026-04-24 22:32:33.294841972 +0000 UTC m=+163.018434542" Apr 24 22:32:34.284234 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:34.284196 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gt62l" event={"ID":"8a25583f-2bb0-4f33-93bd-3e30aec48cee","Type":"ContainerStarted","Data":"e2849f35b1cc799ec07d696760b9d15730088c83685082f4b93995acfdbce8e6"} Apr 24 22:32:34.284626 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:34.284248 2574 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gt62l" event={"ID":"8a25583f-2bb0-4f33-93bd-3e30aec48cee","Type":"ContainerStarted","Data":"b2c3231af64ae61f913374219a98805c43849ebb604a9cc0064c6c27bed6b5c4"} Apr 24 22:32:34.284626 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:34.284352 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-gt62l" Apr 24 22:32:34.309728 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:34.309677 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-gt62l" podStartSLOduration=129.640395107 podStartE2EDuration="2m11.30966588s" podCreationTimestamp="2026-04-24 22:30:23 +0000 UTC" firstStartedPulling="2026-04-24 22:32:31.917462648 +0000 UTC m=+161.641055203" lastFinishedPulling="2026-04-24 22:32:33.586733429 +0000 UTC m=+163.310325976" observedRunningTime="2026-04-24 22:32:34.308897932 +0000 UTC m=+164.032490515" watchObservedRunningTime="2026-04-24 22:32:34.30966588 +0000 UTC m=+164.033258449" Apr 24 22:32:35.842267 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:35.842235 2574 scope.go:117] "RemoveContainer" containerID="fa05f8e7bb74b057ecea2a6ebaad75861ee06c9d0c235d89bc7b0e505847453f" Apr 24 22:32:36.291129 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:36.291101 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6z9wv_2ae54e6c-a291-4b2e-8885-ba0d08f9048c/console-operator/1.log" Apr 24 22:32:36.291366 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:36.291165 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-6z9wv" event={"ID":"2ae54e6c-a291-4b2e-8885-ba0d08f9048c","Type":"ContainerStarted","Data":"89bc2fdd53fa2febaa570faef47d5c0ee8ee7cea26275c858f6db8414d150e48"} Apr 24 22:32:36.291451 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:36.291433 2574 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-6z9wv" Apr 24 22:32:36.517704 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:36.517677 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-6z9wv" Apr 24 22:32:36.536165 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:36.536104 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-6z9wv" podStartSLOduration=20.258236776 podStartE2EDuration="22.536086591s" podCreationTimestamp="2026-04-24 22:32:14 +0000 UTC" firstStartedPulling="2026-04-24 22:32:15.353086557 +0000 UTC m=+145.076679108" lastFinishedPulling="2026-04-24 22:32:17.630936373 +0000 UTC m=+147.354528923" observedRunningTime="2026-04-24 22:32:36.312776072 +0000 UTC m=+166.036368652" watchObservedRunningTime="2026-04-24 22:32:36.536086591 +0000 UTC m=+166.259679164" Apr 24 22:32:39.842808 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:39.842766 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ztg8" Apr 24 22:32:40.843823 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:40.843793 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vffmw" Apr 24 22:32:40.846537 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:40.846516 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-zzr8g\"" Apr 24 22:32:40.854795 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:40.854769 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vffmw" Apr 24 22:32:40.965438 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:40.965411 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vffmw"] Apr 24 22:32:40.968417 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:32:40.968390 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8eff2e0b_1c86_4882_8b55_3fb02eb38ebe.slice/crio-2a2258d3a885ccced3dae41bc99e40f0bdb87668060ebbc3bf35f1fa40ad3123 WatchSource:0}: Error finding container 2a2258d3a885ccced3dae41bc99e40f0bdb87668060ebbc3bf35f1fa40ad3123: Status 404 returned error can't find the container with id 2a2258d3a885ccced3dae41bc99e40f0bdb87668060ebbc3bf35f1fa40ad3123 Apr 24 22:32:41.307510 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:41.307474 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vffmw" event={"ID":"8eff2e0b-1c86-4882-8b55-3fb02eb38ebe","Type":"ContainerStarted","Data":"2a2258d3a885ccced3dae41bc99e40f0bdb87668060ebbc3bf35f1fa40ad3123"} Apr 24 22:32:41.825702 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:41.825667 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-qmpbb"] Apr 24 22:32:41.830035 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:41.829980 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-qmpbb" Apr 24 22:32:41.832551 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:41.832522 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 22:32:41.832670 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:41.832556 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 22:32:41.833680 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:41.833658 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-f6lt5\"" Apr 24 22:32:41.840386 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:41.840356 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-qmpbb"] Apr 24 22:32:41.848584 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:41.848563 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-c7ccbc59-779tb"] Apr 24 22:32:41.859502 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:41.859225 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/31f36569-beca-46b9-b9fd-88f0cc0404d1-data-volume\") pod \"insights-runtime-extractor-qmpbb\" (UID: \"31f36569-beca-46b9-b9fd-88f0cc0404d1\") " pod="openshift-insights/insights-runtime-extractor-qmpbb" Apr 24 22:32:41.859502 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:41.859264 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/31f36569-beca-46b9-b9fd-88f0cc0404d1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qmpbb\" (UID: \"31f36569-beca-46b9-b9fd-88f0cc0404d1\") " 
pod="openshift-insights/insights-runtime-extractor-qmpbb" Apr 24 22:32:41.859502 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:41.859305 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/31f36569-beca-46b9-b9fd-88f0cc0404d1-crio-socket\") pod \"insights-runtime-extractor-qmpbb\" (UID: \"31f36569-beca-46b9-b9fd-88f0cc0404d1\") " pod="openshift-insights/insights-runtime-extractor-qmpbb" Apr 24 22:32:41.859502 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:41.859354 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/31f36569-beca-46b9-b9fd-88f0cc0404d1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qmpbb\" (UID: \"31f36569-beca-46b9-b9fd-88f0cc0404d1\") " pod="openshift-insights/insights-runtime-extractor-qmpbb" Apr 24 22:32:41.859502 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:41.859378 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th99p\" (UniqueName: \"kubernetes.io/projected/31f36569-beca-46b9-b9fd-88f0cc0404d1-kube-api-access-th99p\") pod \"insights-runtime-extractor-qmpbb\" (UID: \"31f36569-beca-46b9-b9fd-88f0cc0404d1\") " pod="openshift-insights/insights-runtime-extractor-qmpbb" Apr 24 22:32:41.862245 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:41.862222 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-hcwpm"] Apr 24 22:32:41.867113 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:41.867089 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-hcwpm" Apr 24 22:32:41.870410 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:41.869909 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 24 22:32:41.870410 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:41.870190 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-sb99h\"" Apr 24 22:32:41.870915 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:41.870752 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 24 22:32:41.878827 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:41.878539 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-hcwpm"] Apr 24 22:32:41.959974 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:41.959928 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/31f36569-beca-46b9-b9fd-88f0cc0404d1-data-volume\") pod \"insights-runtime-extractor-qmpbb\" (UID: \"31f36569-beca-46b9-b9fd-88f0cc0404d1\") " pod="openshift-insights/insights-runtime-extractor-qmpbb" Apr 24 22:32:41.959974 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:41.959977 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/31f36569-beca-46b9-b9fd-88f0cc0404d1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qmpbb\" (UID: \"31f36569-beca-46b9-b9fd-88f0cc0404d1\") " pod="openshift-insights/insights-runtime-extractor-qmpbb" Apr 24 22:32:41.960211 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:41.960003 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqcdb\" (UniqueName: 
\"kubernetes.io/projected/584c5d46-6f2a-4b43-a0d2-132c52dd9409-kube-api-access-vqcdb\") pod \"downloads-6bcc868b7-hcwpm\" (UID: \"584c5d46-6f2a-4b43-a0d2-132c52dd9409\") " pod="openshift-console/downloads-6bcc868b7-hcwpm" Apr 24 22:32:41.960211 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:41.960028 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/31f36569-beca-46b9-b9fd-88f0cc0404d1-crio-socket\") pod \"insights-runtime-extractor-qmpbb\" (UID: \"31f36569-beca-46b9-b9fd-88f0cc0404d1\") " pod="openshift-insights/insights-runtime-extractor-qmpbb" Apr 24 22:32:41.960211 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:41.960063 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/31f36569-beca-46b9-b9fd-88f0cc0404d1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qmpbb\" (UID: \"31f36569-beca-46b9-b9fd-88f0cc0404d1\") " pod="openshift-insights/insights-runtime-extractor-qmpbb" Apr 24 22:32:41.960211 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:41.960114 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-th99p\" (UniqueName: \"kubernetes.io/projected/31f36569-beca-46b9-b9fd-88f0cc0404d1-kube-api-access-th99p\") pod \"insights-runtime-extractor-qmpbb\" (UID: \"31f36569-beca-46b9-b9fd-88f0cc0404d1\") " pod="openshift-insights/insights-runtime-extractor-qmpbb" Apr 24 22:32:41.960211 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:41.960146 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/31f36569-beca-46b9-b9fd-88f0cc0404d1-crio-socket\") pod \"insights-runtime-extractor-qmpbb\" (UID: \"31f36569-beca-46b9-b9fd-88f0cc0404d1\") " pod="openshift-insights/insights-runtime-extractor-qmpbb" Apr 24 22:32:41.960463 ip-10-0-133-161 kubenswrapper[2574]: I0424 
22:32:41.960306 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/31f36569-beca-46b9-b9fd-88f0cc0404d1-data-volume\") pod \"insights-runtime-extractor-qmpbb\" (UID: \"31f36569-beca-46b9-b9fd-88f0cc0404d1\") " pod="openshift-insights/insights-runtime-extractor-qmpbb" Apr 24 22:32:41.960646 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:41.960626 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/31f36569-beca-46b9-b9fd-88f0cc0404d1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qmpbb\" (UID: \"31f36569-beca-46b9-b9fd-88f0cc0404d1\") " pod="openshift-insights/insights-runtime-extractor-qmpbb" Apr 24 22:32:41.962338 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:41.962312 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/31f36569-beca-46b9-b9fd-88f0cc0404d1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qmpbb\" (UID: \"31f36569-beca-46b9-b9fd-88f0cc0404d1\") " pod="openshift-insights/insights-runtime-extractor-qmpbb" Apr 24 22:32:41.970300 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:41.970279 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-th99p\" (UniqueName: \"kubernetes.io/projected/31f36569-beca-46b9-b9fd-88f0cc0404d1-kube-api-access-th99p\") pod \"insights-runtime-extractor-qmpbb\" (UID: \"31f36569-beca-46b9-b9fd-88f0cc0404d1\") " pod="openshift-insights/insights-runtime-extractor-qmpbb" Apr 24 22:32:42.061400 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:42.061366 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vqcdb\" (UniqueName: \"kubernetes.io/projected/584c5d46-6f2a-4b43-a0d2-132c52dd9409-kube-api-access-vqcdb\") pod \"downloads-6bcc868b7-hcwpm\" (UID: 
\"584c5d46-6f2a-4b43-a0d2-132c52dd9409\") " pod="openshift-console/downloads-6bcc868b7-hcwpm" Apr 24 22:32:42.072108 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:42.072077 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqcdb\" (UniqueName: \"kubernetes.io/projected/584c5d46-6f2a-4b43-a0d2-132c52dd9409-kube-api-access-vqcdb\") pod \"downloads-6bcc868b7-hcwpm\" (UID: \"584c5d46-6f2a-4b43-a0d2-132c52dd9409\") " pod="openshift-console/downloads-6bcc868b7-hcwpm" Apr 24 22:32:42.147129 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:42.147094 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-qmpbb" Apr 24 22:32:42.180129 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:42.180099 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-hcwpm" Apr 24 22:32:42.536330 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:42.536302 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-qmpbb"] Apr 24 22:32:42.540863 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:32:42.540838 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31f36569_beca_46b9_b9fd_88f0cc0404d1.slice/crio-7dd23d6aad4938aaa7961437a924d0fb1668749dc9b63c5d1fe903b2a8264a2e WatchSource:0}: Error finding container 7dd23d6aad4938aaa7961437a924d0fb1668749dc9b63c5d1fe903b2a8264a2e: Status 404 returned error can't find the container with id 7dd23d6aad4938aaa7961437a924d0fb1668749dc9b63c5d1fe903b2a8264a2e Apr 24 22:32:42.552116 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:42.552090 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-hcwpm"] Apr 24 22:32:42.557830 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:32:42.557767 2574 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod584c5d46_6f2a_4b43_a0d2_132c52dd9409.slice/crio-317315d51ee28490f3efbec533f2d97d11a2124a5860d9f7318fcabfae3a19ae WatchSource:0}: Error finding container 317315d51ee28490f3efbec533f2d97d11a2124a5860d9f7318fcabfae3a19ae: Status 404 returned error can't find the container with id 317315d51ee28490f3efbec533f2d97d11a2124a5860d9f7318fcabfae3a19ae Apr 24 22:32:43.314275 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:43.314235 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vffmw" event={"ID":"8eff2e0b-1c86-4882-8b55-3fb02eb38ebe","Type":"ContainerStarted","Data":"272fb93dad59c19492c0968c6375690ebaf1b4866c297ba22f9038ecf68c489a"} Apr 24 22:32:43.315988 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:43.315944 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qmpbb" event={"ID":"31f36569-beca-46b9-b9fd-88f0cc0404d1","Type":"ContainerStarted","Data":"76eff5cf8ca459ac2a5014f59defeb38b8be6475705d60690ab6583019e79ceb"} Apr 24 22:32:43.316127 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:43.315994 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qmpbb" event={"ID":"31f36569-beca-46b9-b9fd-88f0cc0404d1","Type":"ContainerStarted","Data":"1209fbb0762fa6e86e061b21f107b6919ae9334989b5c92b0ba577da6a4ba3f8"} Apr 24 22:32:43.316127 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:43.316006 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qmpbb" event={"ID":"31f36569-beca-46b9-b9fd-88f0cc0404d1","Type":"ContainerStarted","Data":"7dd23d6aad4938aaa7961437a924d0fb1668749dc9b63c5d1fe903b2a8264a2e"} Apr 24 22:32:43.317272 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:43.317252 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/downloads-6bcc868b7-hcwpm" event={"ID":"584c5d46-6f2a-4b43-a0d2-132c52dd9409","Type":"ContainerStarted","Data":"317315d51ee28490f3efbec533f2d97d11a2124a5860d9f7318fcabfae3a19ae"} Apr 24 22:32:43.330664 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:43.330610 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-vffmw" podStartSLOduration=138.839638354 podStartE2EDuration="2m20.330593763s" podCreationTimestamp="2026-04-24 22:30:23 +0000 UTC" firstStartedPulling="2026-04-24 22:32:40.970256315 +0000 UTC m=+170.693848866" lastFinishedPulling="2026-04-24 22:32:42.461211715 +0000 UTC m=+172.184804275" observedRunningTime="2026-04-24 22:32:43.329726513 +0000 UTC m=+173.053319084" watchObservedRunningTime="2026-04-24 22:32:43.330593763 +0000 UTC m=+173.054186337" Apr 24 22:32:44.290650 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:44.290610 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-gt62l" Apr 24 22:32:45.326937 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:45.326892 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qmpbb" event={"ID":"31f36569-beca-46b9-b9fd-88f0cc0404d1","Type":"ContainerStarted","Data":"bafc42da89f3275e342d2b60c5cdd1ec20c72a59cc98126aa09617a16724dfd3"} Apr 24 22:32:45.344431 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:45.344374 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-qmpbb" podStartSLOduration=2.356449078 podStartE2EDuration="4.344356487s" podCreationTimestamp="2026-04-24 22:32:41 +0000 UTC" firstStartedPulling="2026-04-24 22:32:42.599916428 +0000 UTC m=+172.323508979" lastFinishedPulling="2026-04-24 22:32:44.587823841 +0000 UTC m=+174.311416388" observedRunningTime="2026-04-24 22:32:45.343023167 +0000 UTC m=+175.066615736" 
watchObservedRunningTime="2026-04-24 22:32:45.344356487 +0000 UTC m=+175.067949053" Apr 24 22:32:46.022018 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:46.021990 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-797d95865c-xk2x7"] Apr 24 22:32:46.024916 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:46.024889 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-797d95865c-xk2x7" Apr 24 22:32:46.028621 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:46.028563 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 24 22:32:46.028621 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:46.028599 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 24 22:32:46.028823 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:46.028795 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 24 22:32:46.028888 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:46.028832 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-glb98\"" Apr 24 22:32:46.028943 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:46.028928 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 24 22:32:46.029075 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:46.029061 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 24 22:32:46.029859 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:46.029840 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-797d95865c-xk2x7"] Apr 24 22:32:46.092300 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:46.092265 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/67a49298-1d9a-4ed5-ae8c-f018ab319d9e-console-serving-cert\") pod \"console-797d95865c-xk2x7\" (UID: \"67a49298-1d9a-4ed5-ae8c-f018ab319d9e\") " pod="openshift-console/console-797d95865c-xk2x7" Apr 24 22:32:46.092300 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:46.092299 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/67a49298-1d9a-4ed5-ae8c-f018ab319d9e-console-config\") pod \"console-797d95865c-xk2x7\" (UID: \"67a49298-1d9a-4ed5-ae8c-f018ab319d9e\") " pod="openshift-console/console-797d95865c-xk2x7" Apr 24 22:32:46.092532 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:46.092329 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/67a49298-1d9a-4ed5-ae8c-f018ab319d9e-service-ca\") pod \"console-797d95865c-xk2x7\" (UID: \"67a49298-1d9a-4ed5-ae8c-f018ab319d9e\") " pod="openshift-console/console-797d95865c-xk2x7" Apr 24 22:32:46.092532 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:46.092436 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg86n\" (UniqueName: \"kubernetes.io/projected/67a49298-1d9a-4ed5-ae8c-f018ab319d9e-kube-api-access-qg86n\") pod \"console-797d95865c-xk2x7\" (UID: \"67a49298-1d9a-4ed5-ae8c-f018ab319d9e\") " pod="openshift-console/console-797d95865c-xk2x7" Apr 24 22:32:46.092532 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:46.092472 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/67a49298-1d9a-4ed5-ae8c-f018ab319d9e-console-oauth-config\") pod \"console-797d95865c-xk2x7\" (UID: 
\"67a49298-1d9a-4ed5-ae8c-f018ab319d9e\") " pod="openshift-console/console-797d95865c-xk2x7" Apr 24 22:32:46.092532 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:46.092500 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/67a49298-1d9a-4ed5-ae8c-f018ab319d9e-oauth-serving-cert\") pod \"console-797d95865c-xk2x7\" (UID: \"67a49298-1d9a-4ed5-ae8c-f018ab319d9e\") " pod="openshift-console/console-797d95865c-xk2x7" Apr 24 22:32:46.193645 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:46.193605 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/67a49298-1d9a-4ed5-ae8c-f018ab319d9e-console-serving-cert\") pod \"console-797d95865c-xk2x7\" (UID: \"67a49298-1d9a-4ed5-ae8c-f018ab319d9e\") " pod="openshift-console/console-797d95865c-xk2x7" Apr 24 22:32:46.193815 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:46.193724 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/67a49298-1d9a-4ed5-ae8c-f018ab319d9e-console-config\") pod \"console-797d95865c-xk2x7\" (UID: \"67a49298-1d9a-4ed5-ae8c-f018ab319d9e\") " pod="openshift-console/console-797d95865c-xk2x7" Apr 24 22:32:46.193815 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:46.193753 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/67a49298-1d9a-4ed5-ae8c-f018ab319d9e-service-ca\") pod \"console-797d95865c-xk2x7\" (UID: \"67a49298-1d9a-4ed5-ae8c-f018ab319d9e\") " pod="openshift-console/console-797d95865c-xk2x7" Apr 24 22:32:46.193918 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:46.193828 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qg86n\" (UniqueName: 
\"kubernetes.io/projected/67a49298-1d9a-4ed5-ae8c-f018ab319d9e-kube-api-access-qg86n\") pod \"console-797d95865c-xk2x7\" (UID: \"67a49298-1d9a-4ed5-ae8c-f018ab319d9e\") " pod="openshift-console/console-797d95865c-xk2x7" Apr 24 22:32:46.193918 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:46.193859 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/67a49298-1d9a-4ed5-ae8c-f018ab319d9e-console-oauth-config\") pod \"console-797d95865c-xk2x7\" (UID: \"67a49298-1d9a-4ed5-ae8c-f018ab319d9e\") " pod="openshift-console/console-797d95865c-xk2x7" Apr 24 22:32:46.193918 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:46.193901 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/67a49298-1d9a-4ed5-ae8c-f018ab319d9e-oauth-serving-cert\") pod \"console-797d95865c-xk2x7\" (UID: \"67a49298-1d9a-4ed5-ae8c-f018ab319d9e\") " pod="openshift-console/console-797d95865c-xk2x7" Apr 24 22:32:46.194507 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:46.194477 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/67a49298-1d9a-4ed5-ae8c-f018ab319d9e-console-config\") pod \"console-797d95865c-xk2x7\" (UID: \"67a49298-1d9a-4ed5-ae8c-f018ab319d9e\") " pod="openshift-console/console-797d95865c-xk2x7" Apr 24 22:32:46.194837 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:46.194818 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/67a49298-1d9a-4ed5-ae8c-f018ab319d9e-oauth-serving-cert\") pod \"console-797d95865c-xk2x7\" (UID: \"67a49298-1d9a-4ed5-ae8c-f018ab319d9e\") " pod="openshift-console/console-797d95865c-xk2x7" Apr 24 22:32:46.195017 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:46.194995 2574 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/67a49298-1d9a-4ed5-ae8c-f018ab319d9e-service-ca\") pod \"console-797d95865c-xk2x7\" (UID: \"67a49298-1d9a-4ed5-ae8c-f018ab319d9e\") " pod="openshift-console/console-797d95865c-xk2x7" Apr 24 22:32:46.196716 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:46.196692 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/67a49298-1d9a-4ed5-ae8c-f018ab319d9e-console-serving-cert\") pod \"console-797d95865c-xk2x7\" (UID: \"67a49298-1d9a-4ed5-ae8c-f018ab319d9e\") " pod="openshift-console/console-797d95865c-xk2x7" Apr 24 22:32:46.196809 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:46.196729 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/67a49298-1d9a-4ed5-ae8c-f018ab319d9e-console-oauth-config\") pod \"console-797d95865c-xk2x7\" (UID: \"67a49298-1d9a-4ed5-ae8c-f018ab319d9e\") " pod="openshift-console/console-797d95865c-xk2x7" Apr 24 22:32:46.201755 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:46.201733 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg86n\" (UniqueName: \"kubernetes.io/projected/67a49298-1d9a-4ed5-ae8c-f018ab319d9e-kube-api-access-qg86n\") pod \"console-797d95865c-xk2x7\" (UID: \"67a49298-1d9a-4ed5-ae8c-f018ab319d9e\") " pod="openshift-console/console-797d95865c-xk2x7" Apr 24 22:32:46.337702 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:46.337624 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-797d95865c-xk2x7" Apr 24 22:32:46.471422 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:46.471391 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-797d95865c-xk2x7"] Apr 24 22:32:46.474568 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:32:46.474528 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67a49298_1d9a_4ed5_ae8c_f018ab319d9e.slice/crio-28d60490dab01a92f77a97b3d83ca28043de8b93a39e0ec30f7e842c2a81b7f6 WatchSource:0}: Error finding container 28d60490dab01a92f77a97b3d83ca28043de8b93a39e0ec30f7e842c2a81b7f6: Status 404 returned error can't find the container with id 28d60490dab01a92f77a97b3d83ca28043de8b93a39e0ec30f7e842c2a81b7f6 Apr 24 22:32:47.333479 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:47.333435 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-797d95865c-xk2x7" event={"ID":"67a49298-1d9a-4ed5-ae8c-f018ab319d9e","Type":"ContainerStarted","Data":"28d60490dab01a92f77a97b3d83ca28043de8b93a39e0ec30f7e842c2a81b7f6"} Apr 24 22:32:50.344493 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:50.344451 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-797d95865c-xk2x7" event={"ID":"67a49298-1d9a-4ed5-ae8c-f018ab319d9e","Type":"ContainerStarted","Data":"69f54502508ad25b83e6fec79c8b1e3d75ff561d02acedd8dcf8f423f10e6b8c"} Apr 24 22:32:50.363264 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:50.363210 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-797d95865c-xk2x7" podStartSLOduration=1.238793274 podStartE2EDuration="4.363190733s" podCreationTimestamp="2026-04-24 22:32:46 +0000 UTC" firstStartedPulling="2026-04-24 22:32:46.476721786 +0000 UTC m=+176.200314335" lastFinishedPulling="2026-04-24 22:32:49.601119243 +0000 UTC m=+179.324711794" 
observedRunningTime="2026-04-24 22:32:50.362366909 +0000 UTC m=+180.085959481" watchObservedRunningTime="2026-04-24 22:32:50.363190733 +0000 UTC m=+180.086783304" Apr 24 22:32:51.853902 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:51.853873 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-c7ccbc59-779tb" Apr 24 22:32:52.167018 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:52.166923 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-gdzpg"] Apr 24 22:32:52.171305 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:52.171197 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-gdzpg" Apr 24 22:32:52.174394 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:52.174369 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 22:32:52.175399 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:52.175377 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 24 22:32:52.175642 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:52.175626 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 22:32:52.175851 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:52.175835 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-qk2r2\"" Apr 24 22:32:52.175926 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:52.175913 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 24 22:32:52.176298 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:52.176195 2574 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 22:32:52.178284 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:52.178245 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-gdzpg"] Apr 24 22:32:52.244912 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:52.244877 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/c9e17516-d534-43af-9540-9a890f73e5f2-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-gdzpg\" (UID: \"c9e17516-d534-43af-9540-9a890f73e5f2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gdzpg" Apr 24 22:32:52.245104 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:52.244931 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c9e17516-d534-43af-9540-9a890f73e5f2-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-gdzpg\" (UID: \"c9e17516-d534-43af-9540-9a890f73e5f2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gdzpg" Apr 24 22:32:52.245104 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:52.245025 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh49x\" (UniqueName: \"kubernetes.io/projected/c9e17516-d534-43af-9540-9a890f73e5f2-kube-api-access-hh49x\") pod \"prometheus-operator-5676c8c784-gdzpg\" (UID: \"c9e17516-d534-43af-9540-9a890f73e5f2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gdzpg" Apr 24 22:32:52.245104 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:52.245095 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/c9e17516-d534-43af-9540-9a890f73e5f2-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-gdzpg\" (UID: \"c9e17516-d534-43af-9540-9a890f73e5f2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gdzpg" Apr 24 22:32:52.345866 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:52.345829 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/c9e17516-d534-43af-9540-9a890f73e5f2-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-gdzpg\" (UID: \"c9e17516-d534-43af-9540-9a890f73e5f2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gdzpg" Apr 24 22:32:52.346085 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:52.345898 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c9e17516-d534-43af-9540-9a890f73e5f2-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-gdzpg\" (UID: \"c9e17516-d534-43af-9540-9a890f73e5f2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gdzpg" Apr 24 22:32:52.346085 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:52.345936 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hh49x\" (UniqueName: \"kubernetes.io/projected/c9e17516-d534-43af-9540-9a890f73e5f2-kube-api-access-hh49x\") pod \"prometheus-operator-5676c8c784-gdzpg\" (UID: \"c9e17516-d534-43af-9540-9a890f73e5f2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gdzpg" Apr 24 22:32:52.346085 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:52.345990 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c9e17516-d534-43af-9540-9a890f73e5f2-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-gdzpg\" (UID: 
\"c9e17516-d534-43af-9540-9a890f73e5f2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gdzpg" Apr 24 22:32:52.346085 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:32:52.346013 2574 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 24 22:32:52.346294 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:32:52.346090 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9e17516-d534-43af-9540-9a890f73e5f2-prometheus-operator-tls podName:c9e17516-d534-43af-9540-9a890f73e5f2 nodeName:}" failed. No retries permitted until 2026-04-24 22:32:52.846070321 +0000 UTC m=+182.569662882 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/c9e17516-d534-43af-9540-9a890f73e5f2-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-gdzpg" (UID: "c9e17516-d534-43af-9540-9a890f73e5f2") : secret "prometheus-operator-tls" not found Apr 24 22:32:52.346737 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:52.346709 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c9e17516-d534-43af-9540-9a890f73e5f2-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-gdzpg\" (UID: \"c9e17516-d534-43af-9540-9a890f73e5f2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gdzpg" Apr 24 22:32:52.348740 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:52.348714 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c9e17516-d534-43af-9540-9a890f73e5f2-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-gdzpg\" (UID: \"c9e17516-d534-43af-9540-9a890f73e5f2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gdzpg" Apr 24 22:32:52.356772 ip-10-0-133-161 kubenswrapper[2574]: 
I0424 22:32:52.356745 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh49x\" (UniqueName: \"kubernetes.io/projected/c9e17516-d534-43af-9540-9a890f73e5f2-kube-api-access-hh49x\") pod \"prometheus-operator-5676c8c784-gdzpg\" (UID: \"c9e17516-d534-43af-9540-9a890f73e5f2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gdzpg" Apr 24 22:32:52.851123 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:52.851088 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/c9e17516-d534-43af-9540-9a890f73e5f2-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-gdzpg\" (UID: \"c9e17516-d534-43af-9540-9a890f73e5f2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gdzpg" Apr 24 22:32:52.853824 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:52.853799 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/c9e17516-d534-43af-9540-9a890f73e5f2-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-gdzpg\" (UID: \"c9e17516-d534-43af-9540-9a890f73e5f2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gdzpg" Apr 24 22:32:53.083673 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:53.083636 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-gdzpg" Apr 24 22:32:55.411016 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:55.410984 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7c9d9ddd69-hxpjz"] Apr 24 22:32:55.415397 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:55.415374 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7c9d9ddd69-hxpjz" Apr 24 22:32:55.423428 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:55.423393 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 24 22:32:55.431641 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:55.431613 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c9d9ddd69-hxpjz"] Apr 24 22:32:55.474164 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:55.474132 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/815bae00-356b-484d-9532-37027c9aae29-service-ca\") pod \"console-7c9d9ddd69-hxpjz\" (UID: \"815bae00-356b-484d-9532-37027c9aae29\") " pod="openshift-console/console-7c9d9ddd69-hxpjz" Apr 24 22:32:55.474323 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:55.474180 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wg9d\" (UniqueName: \"kubernetes.io/projected/815bae00-356b-484d-9532-37027c9aae29-kube-api-access-2wg9d\") pod \"console-7c9d9ddd69-hxpjz\" (UID: \"815bae00-356b-484d-9532-37027c9aae29\") " pod="openshift-console/console-7c9d9ddd69-hxpjz" Apr 24 22:32:55.474323 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:55.474243 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/815bae00-356b-484d-9532-37027c9aae29-console-oauth-config\") pod \"console-7c9d9ddd69-hxpjz\" (UID: \"815bae00-356b-484d-9532-37027c9aae29\") " pod="openshift-console/console-7c9d9ddd69-hxpjz" Apr 24 22:32:55.474323 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:55.474285 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/815bae00-356b-484d-9532-37027c9aae29-oauth-serving-cert\") pod \"console-7c9d9ddd69-hxpjz\" (UID: \"815bae00-356b-484d-9532-37027c9aae29\") " pod="openshift-console/console-7c9d9ddd69-hxpjz" Apr 24 22:32:55.474323 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:55.474310 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/815bae00-356b-484d-9532-37027c9aae29-console-config\") pod \"console-7c9d9ddd69-hxpjz\" (UID: \"815bae00-356b-484d-9532-37027c9aae29\") " pod="openshift-console/console-7c9d9ddd69-hxpjz" Apr 24 22:32:55.474488 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:55.474327 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/815bae00-356b-484d-9532-37027c9aae29-trusted-ca-bundle\") pod \"console-7c9d9ddd69-hxpjz\" (UID: \"815bae00-356b-484d-9532-37027c9aae29\") " pod="openshift-console/console-7c9d9ddd69-hxpjz" Apr 24 22:32:55.474488 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:55.474345 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/815bae00-356b-484d-9532-37027c9aae29-console-serving-cert\") pod \"console-7c9d9ddd69-hxpjz\" (UID: \"815bae00-356b-484d-9532-37027c9aae29\") " pod="openshift-console/console-7c9d9ddd69-hxpjz" Apr 24 22:32:55.575089 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:55.575052 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2wg9d\" (UniqueName: \"kubernetes.io/projected/815bae00-356b-484d-9532-37027c9aae29-kube-api-access-2wg9d\") pod \"console-7c9d9ddd69-hxpjz\" (UID: \"815bae00-356b-484d-9532-37027c9aae29\") " pod="openshift-console/console-7c9d9ddd69-hxpjz" Apr 24 22:32:55.575089 ip-10-0-133-161 kubenswrapper[2574]: 
I0424 22:32:55.575094 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/815bae00-356b-484d-9532-37027c9aae29-console-oauth-config\") pod \"console-7c9d9ddd69-hxpjz\" (UID: \"815bae00-356b-484d-9532-37027c9aae29\") " pod="openshift-console/console-7c9d9ddd69-hxpjz" Apr 24 22:32:55.575304 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:55.575242 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/815bae00-356b-484d-9532-37027c9aae29-oauth-serving-cert\") pod \"console-7c9d9ddd69-hxpjz\" (UID: \"815bae00-356b-484d-9532-37027c9aae29\") " pod="openshift-console/console-7c9d9ddd69-hxpjz" Apr 24 22:32:55.575304 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:55.575288 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/815bae00-356b-484d-9532-37027c9aae29-console-config\") pod \"console-7c9d9ddd69-hxpjz\" (UID: \"815bae00-356b-484d-9532-37027c9aae29\") " pod="openshift-console/console-7c9d9ddd69-hxpjz" Apr 24 22:32:55.575392 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:55.575314 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/815bae00-356b-484d-9532-37027c9aae29-trusted-ca-bundle\") pod \"console-7c9d9ddd69-hxpjz\" (UID: \"815bae00-356b-484d-9532-37027c9aae29\") " pod="openshift-console/console-7c9d9ddd69-hxpjz" Apr 24 22:32:55.575392 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:55.575331 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/815bae00-356b-484d-9532-37027c9aae29-console-serving-cert\") pod \"console-7c9d9ddd69-hxpjz\" (UID: \"815bae00-356b-484d-9532-37027c9aae29\") " 
pod="openshift-console/console-7c9d9ddd69-hxpjz" Apr 24 22:32:55.575476 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:55.575396 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/815bae00-356b-484d-9532-37027c9aae29-service-ca\") pod \"console-7c9d9ddd69-hxpjz\" (UID: \"815bae00-356b-484d-9532-37027c9aae29\") " pod="openshift-console/console-7c9d9ddd69-hxpjz" Apr 24 22:32:55.576031 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:55.576002 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/815bae00-356b-484d-9532-37027c9aae29-oauth-serving-cert\") pod \"console-7c9d9ddd69-hxpjz\" (UID: \"815bae00-356b-484d-9532-37027c9aae29\") " pod="openshift-console/console-7c9d9ddd69-hxpjz" Apr 24 22:32:55.576247 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:55.576225 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/815bae00-356b-484d-9532-37027c9aae29-service-ca\") pod \"console-7c9d9ddd69-hxpjz\" (UID: \"815bae00-356b-484d-9532-37027c9aae29\") " pod="openshift-console/console-7c9d9ddd69-hxpjz" Apr 24 22:32:55.576403 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:55.576333 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/815bae00-356b-484d-9532-37027c9aae29-trusted-ca-bundle\") pod \"console-7c9d9ddd69-hxpjz\" (UID: \"815bae00-356b-484d-9532-37027c9aae29\") " pod="openshift-console/console-7c9d9ddd69-hxpjz" Apr 24 22:32:55.576698 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:55.576675 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/815bae00-356b-484d-9532-37027c9aae29-console-config\") pod \"console-7c9d9ddd69-hxpjz\" (UID: 
\"815bae00-356b-484d-9532-37027c9aae29\") " pod="openshift-console/console-7c9d9ddd69-hxpjz" Apr 24 22:32:55.577861 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:55.577839 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/815bae00-356b-484d-9532-37027c9aae29-console-oauth-config\") pod \"console-7c9d9ddd69-hxpjz\" (UID: \"815bae00-356b-484d-9532-37027c9aae29\") " pod="openshift-console/console-7c9d9ddd69-hxpjz" Apr 24 22:32:55.578023 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:55.578006 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/815bae00-356b-484d-9532-37027c9aae29-console-serving-cert\") pod \"console-7c9d9ddd69-hxpjz\" (UID: \"815bae00-356b-484d-9532-37027c9aae29\") " pod="openshift-console/console-7c9d9ddd69-hxpjz" Apr 24 22:32:55.584088 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:55.584066 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wg9d\" (UniqueName: \"kubernetes.io/projected/815bae00-356b-484d-9532-37027c9aae29-kube-api-access-2wg9d\") pod \"console-7c9d9ddd69-hxpjz\" (UID: \"815bae00-356b-484d-9532-37027c9aae29\") " pod="openshift-console/console-7c9d9ddd69-hxpjz" Apr 24 22:32:55.726115 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:55.726029 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7c9d9ddd69-hxpjz" Apr 24 22:32:56.338251 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:56.338211 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-797d95865c-xk2x7" Apr 24 22:32:56.338251 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:56.338259 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-797d95865c-xk2x7" Apr 24 22:32:56.344424 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:56.344398 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-797d95865c-xk2x7" Apr 24 22:32:56.368785 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:56.368756 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-797d95865c-xk2x7" Apr 24 22:32:58.348174 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:58.348152 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c9d9ddd69-hxpjz"] Apr 24 22:32:58.351354 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:58.351310 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-gdzpg"] Apr 24 22:32:58.352131 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:32:58.352039 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod815bae00_356b_484d_9532_37027c9aae29.slice/crio-a03e971326b5c751838f4408c11b41160b704ba739bb7811ba39a1ae0acab14e WatchSource:0}: Error finding container a03e971326b5c751838f4408c11b41160b704ba739bb7811ba39a1ae0acab14e: Status 404 returned error can't find the container with id a03e971326b5c751838f4408c11b41160b704ba739bb7811ba39a1ae0acab14e Apr 24 22:32:58.354361 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:32:58.354334 2574 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9e17516_d534_43af_9540_9a890f73e5f2.slice/crio-58bcb865c58e2428456da04731be6fcdf02b9aee8327e2e813e3e9578b59c47b WatchSource:0}: Error finding container 58bcb865c58e2428456da04731be6fcdf02b9aee8327e2e813e3e9578b59c47b: Status 404 returned error can't find the container with id 58bcb865c58e2428456da04731be6fcdf02b9aee8327e2e813e3e9578b59c47b Apr 24 22:32:58.370712 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:58.370676 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-hcwpm" event={"ID":"584c5d46-6f2a-4b43-a0d2-132c52dd9409","Type":"ContainerStarted","Data":"bcae164bf9364fd972e8694fb897003f73073b5606848a6d235d9ec0b13ac491"} Apr 24 22:32:58.372063 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:58.372035 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-gdzpg" event={"ID":"c9e17516-d534-43af-9540-9a890f73e5f2","Type":"ContainerStarted","Data":"58bcb865c58e2428456da04731be6fcdf02b9aee8327e2e813e3e9578b59c47b"} Apr 24 22:32:58.373158 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:58.373133 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c9d9ddd69-hxpjz" event={"ID":"815bae00-356b-484d-9532-37027c9aae29","Type":"ContainerStarted","Data":"a03e971326b5c751838f4408c11b41160b704ba739bb7811ba39a1ae0acab14e"} Apr 24 22:32:59.377633 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:59.377592 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c9d9ddd69-hxpjz" event={"ID":"815bae00-356b-484d-9532-37027c9aae29","Type":"ContainerStarted","Data":"9f4c34aa62047bf88c2669eaa252359889ba20fd39733a06727947251c91e515"} Apr 24 22:32:59.397173 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:59.397101 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7c9d9ddd69-hxpjz" 
podStartSLOduration=4.39708458 podStartE2EDuration="4.39708458s" podCreationTimestamp="2026-04-24 22:32:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:32:59.395882232 +0000 UTC m=+189.119474803" watchObservedRunningTime="2026-04-24 22:32:59.39708458 +0000 UTC m=+189.120677151" Apr 24 22:32:59.414105 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:32:59.414018 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-hcwpm" podStartSLOduration=2.680724324 podStartE2EDuration="18.414001342s" podCreationTimestamp="2026-04-24 22:32:41 +0000 UTC" firstStartedPulling="2026-04-24 22:32:42.56078759 +0000 UTC m=+172.284380138" lastFinishedPulling="2026-04-24 22:32:58.294064594 +0000 UTC m=+188.017657156" observedRunningTime="2026-04-24 22:32:59.413171936 +0000 UTC m=+189.136764508" watchObservedRunningTime="2026-04-24 22:32:59.414001342 +0000 UTC m=+189.137593911" Apr 24 22:33:00.382157 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:00.382118 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-gdzpg" event={"ID":"c9e17516-d534-43af-9540-9a890f73e5f2","Type":"ContainerStarted","Data":"113039104d54daa833ff9a62aba7c36b48515361d6cf897e03f7a45553c1f4d3"} Apr 24 22:33:00.382557 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:00.382163 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-gdzpg" event={"ID":"c9e17516-d534-43af-9540-9a890f73e5f2","Type":"ContainerStarted","Data":"362933d992c43e064b6de0f9923cb568e8da508a2d486e71118421eb7e02cee6"} Apr 24 22:33:00.400407 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:00.400358 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-gdzpg" podStartSLOduration=7.08696932 
podStartE2EDuration="8.400345506s" podCreationTimestamp="2026-04-24 22:32:52 +0000 UTC" firstStartedPulling="2026-04-24 22:32:58.356310442 +0000 UTC m=+188.079903004" lastFinishedPulling="2026-04-24 22:32:59.669686628 +0000 UTC m=+189.393279190" observedRunningTime="2026-04-24 22:33:00.399008736 +0000 UTC m=+190.122601307" watchObservedRunningTime="2026-04-24 22:33:00.400345506 +0000 UTC m=+190.123938076" Apr 24 22:33:02.534352 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.534312 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-7p8rc"] Apr 24 22:33:02.547733 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.547699 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-7p8rc"] Apr 24 22:33:02.547894 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.547871 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7p8rc" Apr 24 22:33:02.550499 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.550468 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 24 22:33:02.550644 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.550514 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 24 22:33:02.550868 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.550852 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-krxrp\"" Apr 24 22:33:02.564049 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.564024 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-j2lwr"] Apr 24 22:33:02.574579 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.574557 2574 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-j2lwr" Apr 24 22:33:02.577172 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.577148 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 22:33:02.577442 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.577405 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 22:33:02.577593 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.577574 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-6nxsq\"" Apr 24 22:33:02.578030 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.578009 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 22:33:02.642406 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.642373 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c85adc68-9477-4b3e-b89a-43fb8defbdbb-node-exporter-tls\") pod \"node-exporter-j2lwr\" (UID: \"c85adc68-9477-4b3e-b89a-43fb8defbdbb\") " pod="openshift-monitoring/node-exporter-j2lwr" Apr 24 22:33:02.642607 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.642440 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c85adc68-9477-4b3e-b89a-43fb8defbdbb-root\") pod \"node-exporter-j2lwr\" (UID: \"c85adc68-9477-4b3e-b89a-43fb8defbdbb\") " pod="openshift-monitoring/node-exporter-j2lwr" Apr 24 22:33:02.642607 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.642488 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c85adc68-9477-4b3e-b89a-43fb8defbdbb-node-exporter-textfile\") pod \"node-exporter-j2lwr\" (UID: \"c85adc68-9477-4b3e-b89a-43fb8defbdbb\") " pod="openshift-monitoring/node-exporter-j2lwr" Apr 24 22:33:02.642607 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.642552 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c85adc68-9477-4b3e-b89a-43fb8defbdbb-node-exporter-wtmp\") pod \"node-exporter-j2lwr\" (UID: \"c85adc68-9477-4b3e-b89a-43fb8defbdbb\") " pod="openshift-monitoring/node-exporter-j2lwr" Apr 24 22:33:02.642607 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.642577 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk9p7\" (UniqueName: \"kubernetes.io/projected/96dd1a1c-c821-4172-b742-661cf436945c-kube-api-access-dk9p7\") pod \"openshift-state-metrics-9d44df66c-7p8rc\" (UID: \"96dd1a1c-c821-4172-b742-661cf436945c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7p8rc" Apr 24 22:33:02.642607 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.642601 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c85adc68-9477-4b3e-b89a-43fb8defbdbb-metrics-client-ca\") pod \"node-exporter-j2lwr\" (UID: \"c85adc68-9477-4b3e-b89a-43fb8defbdbb\") " pod="openshift-monitoring/node-exporter-j2lwr" Apr 24 22:33:02.642852 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.642648 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c85adc68-9477-4b3e-b89a-43fb8defbdbb-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-j2lwr\" (UID: 
\"c85adc68-9477-4b3e-b89a-43fb8defbdbb\") " pod="openshift-monitoring/node-exporter-j2lwr" Apr 24 22:33:02.642852 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.642679 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c85adc68-9477-4b3e-b89a-43fb8defbdbb-node-exporter-accelerators-collector-config\") pod \"node-exporter-j2lwr\" (UID: \"c85adc68-9477-4b3e-b89a-43fb8defbdbb\") " pod="openshift-monitoring/node-exporter-j2lwr" Apr 24 22:33:02.642852 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.642738 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/96dd1a1c-c821-4172-b742-661cf436945c-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-7p8rc\" (UID: \"96dd1a1c-c821-4172-b742-661cf436945c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7p8rc" Apr 24 22:33:02.642852 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.642763 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/96dd1a1c-c821-4172-b742-661cf436945c-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-7p8rc\" (UID: \"96dd1a1c-c821-4172-b742-661cf436945c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7p8rc" Apr 24 22:33:02.642852 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.642789 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsc2t\" (UniqueName: \"kubernetes.io/projected/c85adc68-9477-4b3e-b89a-43fb8defbdbb-kube-api-access-tsc2t\") pod \"node-exporter-j2lwr\" (UID: \"c85adc68-9477-4b3e-b89a-43fb8defbdbb\") " 
pod="openshift-monitoring/node-exporter-j2lwr" Apr 24 22:33:02.643097 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.642857 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c85adc68-9477-4b3e-b89a-43fb8defbdbb-sys\") pod \"node-exporter-j2lwr\" (UID: \"c85adc68-9477-4b3e-b89a-43fb8defbdbb\") " pod="openshift-monitoring/node-exporter-j2lwr" Apr 24 22:33:02.643097 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.642882 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/96dd1a1c-c821-4172-b742-661cf436945c-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-7p8rc\" (UID: \"96dd1a1c-c821-4172-b742-661cf436945c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7p8rc" Apr 24 22:33:02.744341 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.744298 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/96dd1a1c-c821-4172-b742-661cf436945c-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-7p8rc\" (UID: \"96dd1a1c-c821-4172-b742-661cf436945c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7p8rc" Apr 24 22:33:02.744598 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.744352 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/96dd1a1c-c821-4172-b742-661cf436945c-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-7p8rc\" (UID: \"96dd1a1c-c821-4172-b742-661cf436945c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7p8rc" Apr 24 22:33:02.744598 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.744376 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tsc2t\" (UniqueName: \"kubernetes.io/projected/c85adc68-9477-4b3e-b89a-43fb8defbdbb-kube-api-access-tsc2t\") pod \"node-exporter-j2lwr\" (UID: \"c85adc68-9477-4b3e-b89a-43fb8defbdbb\") " pod="openshift-monitoring/node-exporter-j2lwr"
Apr 24 22:33:02.744598 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.744403 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c85adc68-9477-4b3e-b89a-43fb8defbdbb-sys\") pod \"node-exporter-j2lwr\" (UID: \"c85adc68-9477-4b3e-b89a-43fb8defbdbb\") " pod="openshift-monitoring/node-exporter-j2lwr"
Apr 24 22:33:02.744598 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.744428 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/96dd1a1c-c821-4172-b742-661cf436945c-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-7p8rc\" (UID: \"96dd1a1c-c821-4172-b742-661cf436945c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7p8rc"
Apr 24 22:33:02.744598 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.744488 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c85adc68-9477-4b3e-b89a-43fb8defbdbb-node-exporter-tls\") pod \"node-exporter-j2lwr\" (UID: \"c85adc68-9477-4b3e-b89a-43fb8defbdbb\") " pod="openshift-monitoring/node-exporter-j2lwr"
Apr 24 22:33:02.744598 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.744514 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c85adc68-9477-4b3e-b89a-43fb8defbdbb-sys\") pod \"node-exporter-j2lwr\" (UID: \"c85adc68-9477-4b3e-b89a-43fb8defbdbb\") " pod="openshift-monitoring/node-exporter-j2lwr"
Apr 24 22:33:02.744598 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.744574 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c85adc68-9477-4b3e-b89a-43fb8defbdbb-root\") pod \"node-exporter-j2lwr\" (UID: \"c85adc68-9477-4b3e-b89a-43fb8defbdbb\") " pod="openshift-monitoring/node-exporter-j2lwr"
Apr 24 22:33:02.744598 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.744525 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c85adc68-9477-4b3e-b89a-43fb8defbdbb-root\") pod \"node-exporter-j2lwr\" (UID: \"c85adc68-9477-4b3e-b89a-43fb8defbdbb\") " pod="openshift-monitoring/node-exporter-j2lwr"
Apr 24 22:33:02.745053 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.744607 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c85adc68-9477-4b3e-b89a-43fb8defbdbb-node-exporter-textfile\") pod \"node-exporter-j2lwr\" (UID: \"c85adc68-9477-4b3e-b89a-43fb8defbdbb\") " pod="openshift-monitoring/node-exporter-j2lwr"
Apr 24 22:33:02.745053 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.744630 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c85adc68-9477-4b3e-b89a-43fb8defbdbb-node-exporter-wtmp\") pod \"node-exporter-j2lwr\" (UID: \"c85adc68-9477-4b3e-b89a-43fb8defbdbb\") " pod="openshift-monitoring/node-exporter-j2lwr"
Apr 24 22:33:02.745053 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.744648 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dk9p7\" (UniqueName: \"kubernetes.io/projected/96dd1a1c-c821-4172-b742-661cf436945c-kube-api-access-dk9p7\") pod \"openshift-state-metrics-9d44df66c-7p8rc\" (UID: \"96dd1a1c-c821-4172-b742-661cf436945c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7p8rc"
Apr 24 22:33:02.745053 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.744666 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c85adc68-9477-4b3e-b89a-43fb8defbdbb-metrics-client-ca\") pod \"node-exporter-j2lwr\" (UID: \"c85adc68-9477-4b3e-b89a-43fb8defbdbb\") " pod="openshift-monitoring/node-exporter-j2lwr"
Apr 24 22:33:02.745053 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.744684 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c85adc68-9477-4b3e-b89a-43fb8defbdbb-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-j2lwr\" (UID: \"c85adc68-9477-4b3e-b89a-43fb8defbdbb\") " pod="openshift-monitoring/node-exporter-j2lwr"
Apr 24 22:33:02.745053 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.744701 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c85adc68-9477-4b3e-b89a-43fb8defbdbb-node-exporter-accelerators-collector-config\") pod \"node-exporter-j2lwr\" (UID: \"c85adc68-9477-4b3e-b89a-43fb8defbdbb\") " pod="openshift-monitoring/node-exporter-j2lwr"
Apr 24 22:33:02.745353 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.745135 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c85adc68-9477-4b3e-b89a-43fb8defbdbb-node-exporter-textfile\") pod \"node-exporter-j2lwr\" (UID: \"c85adc68-9477-4b3e-b89a-43fb8defbdbb\") " pod="openshift-monitoring/node-exporter-j2lwr"
Apr 24 22:33:02.745353 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.745207 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c85adc68-9477-4b3e-b89a-43fb8defbdbb-node-exporter-wtmp\") pod \"node-exporter-j2lwr\" (UID: \"c85adc68-9477-4b3e-b89a-43fb8defbdbb\") " pod="openshift-monitoring/node-exporter-j2lwr"
Apr 24 22:33:02.745353 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.745237 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c85adc68-9477-4b3e-b89a-43fb8defbdbb-node-exporter-accelerators-collector-config\") pod \"node-exporter-j2lwr\" (UID: \"c85adc68-9477-4b3e-b89a-43fb8defbdbb\") " pod="openshift-monitoring/node-exporter-j2lwr"
Apr 24 22:33:02.745639 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.745621 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c85adc68-9477-4b3e-b89a-43fb8defbdbb-metrics-client-ca\") pod \"node-exporter-j2lwr\" (UID: \"c85adc68-9477-4b3e-b89a-43fb8defbdbb\") " pod="openshift-monitoring/node-exporter-j2lwr"
Apr 24 22:33:02.747137 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.747110 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/96dd1a1c-c821-4172-b742-661cf436945c-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-7p8rc\" (UID: \"96dd1a1c-c821-4172-b742-661cf436945c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7p8rc"
Apr 24 22:33:02.747370 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.747352 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/96dd1a1c-c821-4172-b742-661cf436945c-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-7p8rc\" (UID: \"96dd1a1c-c821-4172-b742-661cf436945c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7p8rc"
Apr 24 22:33:02.750995 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.750942 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c85adc68-9477-4b3e-b89a-43fb8defbdbb-node-exporter-tls\") pod \"node-exporter-j2lwr\" (UID: \"c85adc68-9477-4b3e-b89a-43fb8defbdbb\") " pod="openshift-monitoring/node-exporter-j2lwr"
Apr 24 22:33:02.751096 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.751080 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c85adc68-9477-4b3e-b89a-43fb8defbdbb-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-j2lwr\" (UID: \"c85adc68-9477-4b3e-b89a-43fb8defbdbb\") " pod="openshift-monitoring/node-exporter-j2lwr"
Apr 24 22:33:02.753843 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.753819 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/96dd1a1c-c821-4172-b742-661cf436945c-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-7p8rc\" (UID: \"96dd1a1c-c821-4172-b742-661cf436945c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7p8rc"
Apr 24 22:33:02.763612 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.763591 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsc2t\" (UniqueName: \"kubernetes.io/projected/c85adc68-9477-4b3e-b89a-43fb8defbdbb-kube-api-access-tsc2t\") pod \"node-exporter-j2lwr\" (UID: \"c85adc68-9477-4b3e-b89a-43fb8defbdbb\") " pod="openshift-monitoring/node-exporter-j2lwr"
Apr 24 22:33:02.763725 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.763683 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk9p7\" (UniqueName: \"kubernetes.io/projected/96dd1a1c-c821-4172-b742-661cf436945c-kube-api-access-dk9p7\") pod \"openshift-state-metrics-9d44df66c-7p8rc\" (UID: \"96dd1a1c-c821-4172-b742-661cf436945c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7p8rc"
Apr 24 22:33:02.858915 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.858883 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7p8rc"
Apr 24 22:33:02.885765 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.885731 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-j2lwr"
Apr 24 22:33:02.897213 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:33:02.897177 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc85adc68_9477_4b3e_b89a_43fb8defbdbb.slice/crio-f500a0118c2291a22d129da8420006710f163653712dd4f8c180b0c83636b9b8 WatchSource:0}: Error finding container f500a0118c2291a22d129da8420006710f163653712dd4f8c180b0c83636b9b8: Status 404 returned error can't find the container with id f500a0118c2291a22d129da8420006710f163653712dd4f8c180b0c83636b9b8
Apr 24 22:33:02.996597 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:02.996542 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-7p8rc"]
Apr 24 22:33:02.999698 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:33:02.999660 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96dd1a1c_c821_4172_b742_661cf436945c.slice/crio-4566fd54f48e1a0987f293e6d6d811033f26de334e8b3514fbe2fe5740eaea5c WatchSource:0}: Error finding container 4566fd54f48e1a0987f293e6d6d811033f26de334e8b3514fbe2fe5740eaea5c: Status 404 returned error can't find the container with id 4566fd54f48e1a0987f293e6d6d811033f26de334e8b3514fbe2fe5740eaea5c
Apr 24 22:33:03.393735 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:03.393637 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-j2lwr" event={"ID":"c85adc68-9477-4b3e-b89a-43fb8defbdbb","Type":"ContainerStarted","Data":"f500a0118c2291a22d129da8420006710f163653712dd4f8c180b0c83636b9b8"}
Apr 24 22:33:03.395845 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:03.395771 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7p8rc" event={"ID":"96dd1a1c-c821-4172-b742-661cf436945c","Type":"ContainerStarted","Data":"8b2ec5d2c671a5b1c5be1da55041fd00eb72b95055e20004e7aeef5daabca0fe"}
Apr 24 22:33:03.395845 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:03.395808 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7p8rc" event={"ID":"96dd1a1c-c821-4172-b742-661cf436945c","Type":"ContainerStarted","Data":"e7b75e45e51d0be7d176f0eac2616c0678b07104589d02e322631b3ef7af3ade"}
Apr 24 22:33:03.395845 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:03.395824 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7p8rc" event={"ID":"96dd1a1c-c821-4172-b742-661cf436945c","Type":"ContainerStarted","Data":"4566fd54f48e1a0987f293e6d6d811033f26de334e8b3514fbe2fe5740eaea5c"}
Apr 24 22:33:04.403637 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:04.403573 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-j2lwr" event={"ID":"c85adc68-9477-4b3e-b89a-43fb8defbdbb","Type":"ContainerStarted","Data":"28cd8d73e7054cdd0787c015751124ef6155631dd4d36f469eee5930470e8719"}
Apr 24 22:33:05.408398 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:05.408363 2574 generic.go:358] "Generic (PLEG): container finished" podID="c85adc68-9477-4b3e-b89a-43fb8defbdbb" containerID="28cd8d73e7054cdd0787c015751124ef6155631dd4d36f469eee5930470e8719" exitCode=0
Apr 24 22:33:05.408891 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:05.408443 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-j2lwr" event={"ID":"c85adc68-9477-4b3e-b89a-43fb8defbdbb","Type":"ContainerDied","Data":"28cd8d73e7054cdd0787c015751124ef6155631dd4d36f469eee5930470e8719"}
Apr 24 22:33:05.410553 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:05.410512 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7p8rc" event={"ID":"96dd1a1c-c821-4172-b742-661cf436945c","Type":"ContainerStarted","Data":"905dc02635b2249b8e6a7e41d31535e02d66e94a45a83d5e1f36662194230f40"}
Apr 24 22:33:05.444529 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:05.444484 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7p8rc" podStartSLOduration=1.659991129 podStartE2EDuration="3.444470273s" podCreationTimestamp="2026-04-24 22:33:02 +0000 UTC" firstStartedPulling="2026-04-24 22:33:03.224294159 +0000 UTC m=+192.947886709" lastFinishedPulling="2026-04-24 22:33:05.008773288 +0000 UTC m=+194.732365853" observedRunningTime="2026-04-24 22:33:05.443381432 +0000 UTC m=+195.166974002" watchObservedRunningTime="2026-04-24 22:33:05.444470273 +0000 UTC m=+195.168062842"
Apr 24 22:33:05.727073 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:05.727039 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7c9d9ddd69-hxpjz"
Apr 24 22:33:05.727237 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:05.727159 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7c9d9ddd69-hxpjz"
Apr 24 22:33:05.733204 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:05.733180 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7c9d9ddd69-hxpjz"
Apr 24 22:33:06.416211 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:06.416172 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-j2lwr" event={"ID":"c85adc68-9477-4b3e-b89a-43fb8defbdbb","Type":"ContainerStarted","Data":"acd3353f88633a53341d23fc2d999a7bd996c3dde84aa0c7b726ef826fcadec8"}
Apr 24 22:33:06.416630 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:06.416219 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-j2lwr" event={"ID":"c85adc68-9477-4b3e-b89a-43fb8defbdbb","Type":"ContainerStarted","Data":"e33ad46ae2cc244da54113c44b807cc7ec84427ca284362425085502f9ebcb13"}
Apr 24 22:33:06.420783 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:06.420760 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7c9d9ddd69-hxpjz"
Apr 24 22:33:06.439993 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:06.439922 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-j2lwr" podStartSLOduration=3.171758104 podStartE2EDuration="4.439907831s" podCreationTimestamp="2026-04-24 22:33:02 +0000 UTC" firstStartedPulling="2026-04-24 22:33:02.899319846 +0000 UTC m=+192.622912411" lastFinishedPulling="2026-04-24 22:33:04.167469573 +0000 UTC m=+193.891062138" observedRunningTime="2026-04-24 22:33:06.439070789 +0000 UTC m=+196.162663361" watchObservedRunningTime="2026-04-24 22:33:06.439907831 +0000 UTC m=+196.163500401"
Apr 24 22:33:06.493861 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:06.493835 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-797d95865c-xk2x7"]
Apr 24 22:33:06.872612 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:06.872560 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-c7ccbc59-779tb" podUID="3150c082-9c90-4dd3-ad49-9ddb41172e83" containerName="registry" containerID="cri-o://ed190ceb164c74eca1e4c18edb980c1d206daef1cddd80771972cc11d2fcdef3" gracePeriod=30
Apr 24 22:33:06.978336 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:06.978308 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-8d5c47b75-qtv9m"]
Apr 24 22:33:06.998129 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:06.998097 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-8d5c47b75-qtv9m"]
Apr 24 22:33:06.998263 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:06.998233 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-8d5c47b75-qtv9m"
Apr 24 22:33:07.005031 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.004865 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 24 22:33:07.005031 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.004887 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-4ff6u8jp28qf2\""
Apr 24 22:33:07.005031 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.004905 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 24 22:33:07.005031 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.004910 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-zr5xz\""
Apr 24 22:33:07.005350 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.005072 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 24 22:33:07.005350 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.005237 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 24 22:33:07.083719 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.083685 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/c25776a2-f338-47e3-a66f-0dbbe5007841-secret-metrics-server-client-certs\") pod \"metrics-server-8d5c47b75-qtv9m\" (UID: \"c25776a2-f338-47e3-a66f-0dbbe5007841\") " pod="openshift-monitoring/metrics-server-8d5c47b75-qtv9m"
Apr 24 22:33:07.083881 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.083757 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c25776a2-f338-47e3-a66f-0dbbe5007841-client-ca-bundle\") pod \"metrics-server-8d5c47b75-qtv9m\" (UID: \"c25776a2-f338-47e3-a66f-0dbbe5007841\") " pod="openshift-monitoring/metrics-server-8d5c47b75-qtv9m"
Apr 24 22:33:07.083881 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.083800 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/c25776a2-f338-47e3-a66f-0dbbe5007841-metrics-server-audit-profiles\") pod \"metrics-server-8d5c47b75-qtv9m\" (UID: \"c25776a2-f338-47e3-a66f-0dbbe5007841\") " pod="openshift-monitoring/metrics-server-8d5c47b75-qtv9m"
Apr 24 22:33:07.083881 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.083870 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c25776a2-f338-47e3-a66f-0dbbe5007841-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-8d5c47b75-qtv9m\" (UID: \"c25776a2-f338-47e3-a66f-0dbbe5007841\") " pod="openshift-monitoring/metrics-server-8d5c47b75-qtv9m"
Apr 24 22:33:07.084083 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.083908 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/c25776a2-f338-47e3-a66f-0dbbe5007841-audit-log\") pod \"metrics-server-8d5c47b75-qtv9m\" (UID: \"c25776a2-f338-47e3-a66f-0dbbe5007841\") " pod="openshift-monitoring/metrics-server-8d5c47b75-qtv9m"
Apr 24 22:33:07.084083 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.083940 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krmkn\" (UniqueName: \"kubernetes.io/projected/c25776a2-f338-47e3-a66f-0dbbe5007841-kube-api-access-krmkn\") pod \"metrics-server-8d5c47b75-qtv9m\" (UID: \"c25776a2-f338-47e3-a66f-0dbbe5007841\") " pod="openshift-monitoring/metrics-server-8d5c47b75-qtv9m"
Apr 24 22:33:07.084083 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.084012 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/c25776a2-f338-47e3-a66f-0dbbe5007841-secret-metrics-server-tls\") pod \"metrics-server-8d5c47b75-qtv9m\" (UID: \"c25776a2-f338-47e3-a66f-0dbbe5007841\") " pod="openshift-monitoring/metrics-server-8d5c47b75-qtv9m"
Apr 24 22:33:07.137854 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.137831 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-c7ccbc59-779tb"
Apr 24 22:33:07.184441 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.184405 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/c25776a2-f338-47e3-a66f-0dbbe5007841-audit-log\") pod \"metrics-server-8d5c47b75-qtv9m\" (UID: \"c25776a2-f338-47e3-a66f-0dbbe5007841\") " pod="openshift-monitoring/metrics-server-8d5c47b75-qtv9m"
Apr 24 22:33:07.184441 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.184447 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-krmkn\" (UniqueName: \"kubernetes.io/projected/c25776a2-f338-47e3-a66f-0dbbe5007841-kube-api-access-krmkn\") pod \"metrics-server-8d5c47b75-qtv9m\" (UID: \"c25776a2-f338-47e3-a66f-0dbbe5007841\") " pod="openshift-monitoring/metrics-server-8d5c47b75-qtv9m"
Apr 24 22:33:07.184677 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.184495 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/c25776a2-f338-47e3-a66f-0dbbe5007841-secret-metrics-server-tls\") pod \"metrics-server-8d5c47b75-qtv9m\" (UID: \"c25776a2-f338-47e3-a66f-0dbbe5007841\") " pod="openshift-monitoring/metrics-server-8d5c47b75-qtv9m"
Apr 24 22:33:07.184677 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.184539 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/c25776a2-f338-47e3-a66f-0dbbe5007841-secret-metrics-server-client-certs\") pod \"metrics-server-8d5c47b75-qtv9m\" (UID: \"c25776a2-f338-47e3-a66f-0dbbe5007841\") " pod="openshift-monitoring/metrics-server-8d5c47b75-qtv9m"
Apr 24 22:33:07.184677 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.184575 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c25776a2-f338-47e3-a66f-0dbbe5007841-client-ca-bundle\") pod \"metrics-server-8d5c47b75-qtv9m\" (UID: \"c25776a2-f338-47e3-a66f-0dbbe5007841\") " pod="openshift-monitoring/metrics-server-8d5c47b75-qtv9m"
Apr 24 22:33:07.184677 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.184613 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/c25776a2-f338-47e3-a66f-0dbbe5007841-metrics-server-audit-profiles\") pod \"metrics-server-8d5c47b75-qtv9m\" (UID: \"c25776a2-f338-47e3-a66f-0dbbe5007841\") " pod="openshift-monitoring/metrics-server-8d5c47b75-qtv9m"
Apr 24 22:33:07.184677 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.184678 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c25776a2-f338-47e3-a66f-0dbbe5007841-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-8d5c47b75-qtv9m\" (UID: \"c25776a2-f338-47e3-a66f-0dbbe5007841\") " pod="openshift-monitoring/metrics-server-8d5c47b75-qtv9m"
Apr 24 22:33:07.184935 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.184847 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/c25776a2-f338-47e3-a66f-0dbbe5007841-audit-log\") pod \"metrics-server-8d5c47b75-qtv9m\" (UID: \"c25776a2-f338-47e3-a66f-0dbbe5007841\") " pod="openshift-monitoring/metrics-server-8d5c47b75-qtv9m"
Apr 24 22:33:07.185485 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.185405 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c25776a2-f338-47e3-a66f-0dbbe5007841-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-8d5c47b75-qtv9m\" (UID: \"c25776a2-f338-47e3-a66f-0dbbe5007841\") " pod="openshift-monitoring/metrics-server-8d5c47b75-qtv9m"
Apr 24 22:33:07.185805 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.185784 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/c25776a2-f338-47e3-a66f-0dbbe5007841-metrics-server-audit-profiles\") pod \"metrics-server-8d5c47b75-qtv9m\" (UID: \"c25776a2-f338-47e3-a66f-0dbbe5007841\") " pod="openshift-monitoring/metrics-server-8d5c47b75-qtv9m"
Apr 24 22:33:07.187693 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.187668 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c25776a2-f338-47e3-a66f-0dbbe5007841-client-ca-bundle\") pod \"metrics-server-8d5c47b75-qtv9m\" (UID: \"c25776a2-f338-47e3-a66f-0dbbe5007841\") " pod="openshift-monitoring/metrics-server-8d5c47b75-qtv9m"
Apr 24 22:33:07.187693 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.187683 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/c25776a2-f338-47e3-a66f-0dbbe5007841-secret-metrics-server-tls\") pod \"metrics-server-8d5c47b75-qtv9m\" (UID: \"c25776a2-f338-47e3-a66f-0dbbe5007841\") " pod="openshift-monitoring/metrics-server-8d5c47b75-qtv9m"
Apr 24 22:33:07.187844 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.187757 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/c25776a2-f338-47e3-a66f-0dbbe5007841-secret-metrics-server-client-certs\") pod \"metrics-server-8d5c47b75-qtv9m\" (UID: \"c25776a2-f338-47e3-a66f-0dbbe5007841\") " pod="openshift-monitoring/metrics-server-8d5c47b75-qtv9m"
Apr 24 22:33:07.193452 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.193429 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-krmkn\" (UniqueName: \"kubernetes.io/projected/c25776a2-f338-47e3-a66f-0dbbe5007841-kube-api-access-krmkn\") pod \"metrics-server-8d5c47b75-qtv9m\" (UID: \"c25776a2-f338-47e3-a66f-0dbbe5007841\") " pod="openshift-monitoring/metrics-server-8d5c47b75-qtv9m"
Apr 24 22:33:07.285329 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.285289 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3150c082-9c90-4dd3-ad49-9ddb41172e83-installation-pull-secrets\") pod \"3150c082-9c90-4dd3-ad49-9ddb41172e83\" (UID: \"3150c082-9c90-4dd3-ad49-9ddb41172e83\") "
Apr 24 22:33:07.285489 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.285343 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3150c082-9c90-4dd3-ad49-9ddb41172e83-image-registry-private-configuration\") pod \"3150c082-9c90-4dd3-ad49-9ddb41172e83\" (UID: \"3150c082-9c90-4dd3-ad49-9ddb41172e83\") "
Apr 24 22:33:07.285888 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.285825 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3150c082-9c90-4dd3-ad49-9ddb41172e83-bound-sa-token\") pod \"3150c082-9c90-4dd3-ad49-9ddb41172e83\" (UID: \"3150c082-9c90-4dd3-ad49-9ddb41172e83\") "
Apr 24 22:33:07.285888 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.285874 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5z7p\" (UniqueName: \"kubernetes.io/projected/3150c082-9c90-4dd3-ad49-9ddb41172e83-kube-api-access-p5z7p\") pod \"3150c082-9c90-4dd3-ad49-9ddb41172e83\" (UID: \"3150c082-9c90-4dd3-ad49-9ddb41172e83\") "
Apr 24 22:33:07.286107 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.285910 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3150c082-9c90-4dd3-ad49-9ddb41172e83-registry-tls\") pod \"3150c082-9c90-4dd3-ad49-9ddb41172e83\" (UID: \"3150c082-9c90-4dd3-ad49-9ddb41172e83\") "
Apr 24 22:33:07.286107 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.285948 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3150c082-9c90-4dd3-ad49-9ddb41172e83-ca-trust-extracted\") pod \"3150c082-9c90-4dd3-ad49-9ddb41172e83\" (UID: \"3150c082-9c90-4dd3-ad49-9ddb41172e83\") "
Apr 24 22:33:07.286107 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.286016 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3150c082-9c90-4dd3-ad49-9ddb41172e83-registry-certificates\") pod \"3150c082-9c90-4dd3-ad49-9ddb41172e83\" (UID: \"3150c082-9c90-4dd3-ad49-9ddb41172e83\") "
Apr 24 22:33:07.286107 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.286077 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3150c082-9c90-4dd3-ad49-9ddb41172e83-trusted-ca\") pod \"3150c082-9c90-4dd3-ad49-9ddb41172e83\" (UID: \"3150c082-9c90-4dd3-ad49-9ddb41172e83\") "
Apr 24 22:33:07.286785 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.286564 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3150c082-9c90-4dd3-ad49-9ddb41172e83-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "3150c082-9c90-4dd3-ad49-9ddb41172e83" (UID: "3150c082-9c90-4dd3-ad49-9ddb41172e83"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:33:07.286785 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.286602 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3150c082-9c90-4dd3-ad49-9ddb41172e83-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "3150c082-9c90-4dd3-ad49-9ddb41172e83" (UID: "3150c082-9c90-4dd3-ad49-9ddb41172e83"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:33:07.288172 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.288136 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3150c082-9c90-4dd3-ad49-9ddb41172e83-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "3150c082-9c90-4dd3-ad49-9ddb41172e83" (UID: "3150c082-9c90-4dd3-ad49-9ddb41172e83"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:33:07.288760 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.288724 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3150c082-9c90-4dd3-ad49-9ddb41172e83-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "3150c082-9c90-4dd3-ad49-9ddb41172e83" (UID: "3150c082-9c90-4dd3-ad49-9ddb41172e83"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:33:07.288870 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.288845 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3150c082-9c90-4dd3-ad49-9ddb41172e83-kube-api-access-p5z7p" (OuterVolumeSpecName: "kube-api-access-p5z7p") pod "3150c082-9c90-4dd3-ad49-9ddb41172e83" (UID: "3150c082-9c90-4dd3-ad49-9ddb41172e83"). InnerVolumeSpecName "kube-api-access-p5z7p". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:33:07.288946 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.288852 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3150c082-9c90-4dd3-ad49-9ddb41172e83-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "3150c082-9c90-4dd3-ad49-9ddb41172e83" (UID: "3150c082-9c90-4dd3-ad49-9ddb41172e83"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:33:07.289172 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.289139 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3150c082-9c90-4dd3-ad49-9ddb41172e83-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "3150c082-9c90-4dd3-ad49-9ddb41172e83" (UID: "3150c082-9c90-4dd3-ad49-9ddb41172e83"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:33:07.296761 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.296731 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3150c082-9c90-4dd3-ad49-9ddb41172e83-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "3150c082-9c90-4dd3-ad49-9ddb41172e83" (UID: "3150c082-9c90-4dd3-ad49-9ddb41172e83"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:33:07.310360 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.310334 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-8d5c47b75-qtv9m"
Apr 24 22:33:07.338756 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.338729 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-k5srd"]
Apr 24 22:33:07.339104 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.339087 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3150c082-9c90-4dd3-ad49-9ddb41172e83" containerName="registry"
Apr 24 22:33:07.339104 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.339105 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3150c082-9c90-4dd3-ad49-9ddb41172e83" containerName="registry"
Apr 24 22:33:07.339296 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.339177 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="3150c082-9c90-4dd3-ad49-9ddb41172e83" containerName="registry"
Apr 24 22:33:07.372000 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.371860 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-k5srd"]
Apr 24 22:33:07.372177 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.372037 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-k5srd"
Apr 24 22:33:07.375640 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.375411 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 24 22:33:07.375640 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.375411 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-2gd8r\""
Apr 24 22:33:07.387569 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.387253 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3150c082-9c90-4dd3-ad49-9ddb41172e83-trusted-ca\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\""
Apr 24 22:33:07.387569 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.387280 2574 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3150c082-9c90-4dd3-ad49-9ddb41172e83-installation-pull-secrets\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\""
Apr 24 22:33:07.387569 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.387299 2574 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3150c082-9c90-4dd3-ad49-9ddb41172e83-image-registry-private-configuration\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\""
Apr 24 22:33:07.387569 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.387318 2574 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3150c082-9c90-4dd3-ad49-9ddb41172e83-bound-sa-token\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\""
Apr 24 22:33:07.387569 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.387334 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p5z7p\" (UniqueName: 
\"kubernetes.io/projected/3150c082-9c90-4dd3-ad49-9ddb41172e83-kube-api-access-p5z7p\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 22:33:07.387569 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.387349 2574 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3150c082-9c90-4dd3-ad49-9ddb41172e83-registry-tls\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 22:33:07.387569 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.387365 2574 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3150c082-9c90-4dd3-ad49-9ddb41172e83-ca-trust-extracted\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 22:33:07.387569 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.387380 2574 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3150c082-9c90-4dd3-ad49-9ddb41172e83-registry-certificates\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 22:33:07.429639 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.429566 2574 generic.go:358] "Generic (PLEG): container finished" podID="3150c082-9c90-4dd3-ad49-9ddb41172e83" containerID="ed190ceb164c74eca1e4c18edb980c1d206daef1cddd80771972cc11d2fcdef3" exitCode=0 Apr 24 22:33:07.430029 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.429700 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-c7ccbc59-779tb" event={"ID":"3150c082-9c90-4dd3-ad49-9ddb41172e83","Type":"ContainerDied","Data":"ed190ceb164c74eca1e4c18edb980c1d206daef1cddd80771972cc11d2fcdef3"} Apr 24 22:33:07.430029 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.429733 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-c7ccbc59-779tb" 
event={"ID":"3150c082-9c90-4dd3-ad49-9ddb41172e83","Type":"ContainerDied","Data":"1af15764f4c8eab78b56920c237b7cf390e2e27725a826be3148b8fdd1247efa"} Apr 24 22:33:07.430029 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.429752 2574 scope.go:117] "RemoveContainer" containerID="ed190ceb164c74eca1e4c18edb980c1d206daef1cddd80771972cc11d2fcdef3" Apr 24 22:33:07.430517 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.429939 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-c7ccbc59-779tb" Apr 24 22:33:07.444319 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.444295 2574 scope.go:117] "RemoveContainer" containerID="ed190ceb164c74eca1e4c18edb980c1d206daef1cddd80771972cc11d2fcdef3" Apr 24 22:33:07.444638 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:33:07.444610 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed190ceb164c74eca1e4c18edb980c1d206daef1cddd80771972cc11d2fcdef3\": container with ID starting with ed190ceb164c74eca1e4c18edb980c1d206daef1cddd80771972cc11d2fcdef3 not found: ID does not exist" containerID="ed190ceb164c74eca1e4c18edb980c1d206daef1cddd80771972cc11d2fcdef3" Apr 24 22:33:07.444723 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.444649 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed190ceb164c74eca1e4c18edb980c1d206daef1cddd80771972cc11d2fcdef3"} err="failed to get container status \"ed190ceb164c74eca1e4c18edb980c1d206daef1cddd80771972cc11d2fcdef3\": rpc error: code = NotFound desc = could not find container \"ed190ceb164c74eca1e4c18edb980c1d206daef1cddd80771972cc11d2fcdef3\": container with ID starting with ed190ceb164c74eca1e4c18edb980c1d206daef1cddd80771972cc11d2fcdef3 not found: ID does not exist" Apr 24 22:33:07.457551 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.457529 2574 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["openshift-image-registry/image-registry-c7ccbc59-779tb"] Apr 24 22:33:07.461574 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.461554 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-c7ccbc59-779tb"] Apr 24 22:33:07.467434 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.467415 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-8d5c47b75-qtv9m"] Apr 24 22:33:07.471794 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:33:07.471765 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc25776a2_f338_47e3_a66f_0dbbe5007841.slice/crio-29de72aa0740a3fc6159fd96b39e923c2ba2fcf122f72ec9930d5cc32e557f15 WatchSource:0}: Error finding container 29de72aa0740a3fc6159fd96b39e923c2ba2fcf122f72ec9930d5cc32e557f15: Status 404 returned error can't find the container with id 29de72aa0740a3fc6159fd96b39e923c2ba2fcf122f72ec9930d5cc32e557f15 Apr 24 22:33:07.488379 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.488357 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7d231644-f466-44c6-8279-d34341dcfa89-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-k5srd\" (UID: \"7d231644-f466-44c6-8279-d34341dcfa89\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-k5srd" Apr 24 22:33:07.589504 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.589465 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7d231644-f466-44c6-8279-d34341dcfa89-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-k5srd\" (UID: \"7d231644-f466-44c6-8279-d34341dcfa89\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-k5srd" Apr 24 22:33:07.592340 ip-10-0-133-161 kubenswrapper[2574]: 
I0424 22:33:07.592313 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7d231644-f466-44c6-8279-d34341dcfa89-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-k5srd\" (UID: \"7d231644-f466-44c6-8279-d34341dcfa89\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-k5srd" Apr 24 22:33:07.687098 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.687029 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-k5srd" Apr 24 22:33:07.769144 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.769072 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-576b4d6588-xn2zj"] Apr 24 22:33:07.808382 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.808349 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-576b4d6588-xn2zj"] Apr 24 22:33:07.808610 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.808577 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-576b4d6588-xn2zj" Apr 24 22:33:07.811389 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.811231 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 24 22:33:07.811389 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.811249 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 24 22:33:07.811590 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.811562 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 24 22:33:07.811654 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.811627 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 24 22:33:07.811733 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.811711 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 24 22:33:07.811987 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.811952 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-g5tgn\"" Apr 24 22:33:07.816845 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.816821 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 24 22:33:07.826127 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.826106 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-k5srd"] Apr 24 22:33:07.830128 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:33:07.830102 2574 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d231644_f466_44c6_8279_d34341dcfa89.slice/crio-48e663719e9501686278d1ec3ea7533256126f925587019d300cb807bc96acc3 WatchSource:0}: Error finding container 48e663719e9501686278d1ec3ea7533256126f925587019d300cb807bc96acc3: Status 404 returned error can't find the container with id 48e663719e9501686278d1ec3ea7533256126f925587019d300cb807bc96acc3 Apr 24 22:33:07.891679 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.891644 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/87e464d4-7566-453e-bc71-f3a661e74c9d-telemeter-client-tls\") pod \"telemeter-client-576b4d6588-xn2zj\" (UID: \"87e464d4-7566-453e-bc71-f3a661e74c9d\") " pod="openshift-monitoring/telemeter-client-576b4d6588-xn2zj" Apr 24 22:33:07.891844 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.891687 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pllkr\" (UniqueName: \"kubernetes.io/projected/87e464d4-7566-453e-bc71-f3a661e74c9d-kube-api-access-pllkr\") pod \"telemeter-client-576b4d6588-xn2zj\" (UID: \"87e464d4-7566-453e-bc71-f3a661e74c9d\") " pod="openshift-monitoring/telemeter-client-576b4d6588-xn2zj" Apr 24 22:33:07.891844 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.891718 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/87e464d4-7566-453e-bc71-f3a661e74c9d-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-576b4d6588-xn2zj\" (UID: \"87e464d4-7566-453e-bc71-f3a661e74c9d\") " pod="openshift-monitoring/telemeter-client-576b4d6588-xn2zj" Apr 24 22:33:07.891844 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.891813 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87e464d4-7566-453e-bc71-f3a661e74c9d-metrics-client-ca\") pod \"telemeter-client-576b4d6588-xn2zj\" (UID: \"87e464d4-7566-453e-bc71-f3a661e74c9d\") " pod="openshift-monitoring/telemeter-client-576b4d6588-xn2zj" Apr 24 22:33:07.891988 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.891867 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/87e464d4-7566-453e-bc71-f3a661e74c9d-secret-telemeter-client\") pod \"telemeter-client-576b4d6588-xn2zj\" (UID: \"87e464d4-7566-453e-bc71-f3a661e74c9d\") " pod="openshift-monitoring/telemeter-client-576b4d6588-xn2zj" Apr 24 22:33:07.891988 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.891907 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/87e464d4-7566-453e-bc71-f3a661e74c9d-federate-client-tls\") pod \"telemeter-client-576b4d6588-xn2zj\" (UID: \"87e464d4-7566-453e-bc71-f3a661e74c9d\") " pod="openshift-monitoring/telemeter-client-576b4d6588-xn2zj" Apr 24 22:33:07.891988 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.891942 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87e464d4-7566-453e-bc71-f3a661e74c9d-telemeter-trusted-ca-bundle\") pod \"telemeter-client-576b4d6588-xn2zj\" (UID: \"87e464d4-7566-453e-bc71-f3a661e74c9d\") " pod="openshift-monitoring/telemeter-client-576b4d6588-xn2zj" Apr 24 22:33:07.892123 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.892012 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/87e464d4-7566-453e-bc71-f3a661e74c9d-serving-certs-ca-bundle\") pod \"telemeter-client-576b4d6588-xn2zj\" (UID: \"87e464d4-7566-453e-bc71-f3a661e74c9d\") " pod="openshift-monitoring/telemeter-client-576b4d6588-xn2zj" Apr 24 22:33:07.993588 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.993502 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pllkr\" (UniqueName: \"kubernetes.io/projected/87e464d4-7566-453e-bc71-f3a661e74c9d-kube-api-access-pllkr\") pod \"telemeter-client-576b4d6588-xn2zj\" (UID: \"87e464d4-7566-453e-bc71-f3a661e74c9d\") " pod="openshift-monitoring/telemeter-client-576b4d6588-xn2zj" Apr 24 22:33:07.993588 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.993556 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/87e464d4-7566-453e-bc71-f3a661e74c9d-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-576b4d6588-xn2zj\" (UID: \"87e464d4-7566-453e-bc71-f3a661e74c9d\") " pod="openshift-monitoring/telemeter-client-576b4d6588-xn2zj" Apr 24 22:33:07.993896 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.993673 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87e464d4-7566-453e-bc71-f3a661e74c9d-metrics-client-ca\") pod \"telemeter-client-576b4d6588-xn2zj\" (UID: \"87e464d4-7566-453e-bc71-f3a661e74c9d\") " pod="openshift-monitoring/telemeter-client-576b4d6588-xn2zj" Apr 24 22:33:07.993896 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.993745 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/87e464d4-7566-453e-bc71-f3a661e74c9d-secret-telemeter-client\") pod \"telemeter-client-576b4d6588-xn2zj\" (UID: \"87e464d4-7566-453e-bc71-f3a661e74c9d\") " 
pod="openshift-monitoring/telemeter-client-576b4d6588-xn2zj" Apr 24 22:33:07.993896 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.993784 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/87e464d4-7566-453e-bc71-f3a661e74c9d-federate-client-tls\") pod \"telemeter-client-576b4d6588-xn2zj\" (UID: \"87e464d4-7566-453e-bc71-f3a661e74c9d\") " pod="openshift-monitoring/telemeter-client-576b4d6588-xn2zj" Apr 24 22:33:07.993896 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.993825 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87e464d4-7566-453e-bc71-f3a661e74c9d-telemeter-trusted-ca-bundle\") pod \"telemeter-client-576b4d6588-xn2zj\" (UID: \"87e464d4-7566-453e-bc71-f3a661e74c9d\") " pod="openshift-monitoring/telemeter-client-576b4d6588-xn2zj" Apr 24 22:33:07.993896 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.993894 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87e464d4-7566-453e-bc71-f3a661e74c9d-serving-certs-ca-bundle\") pod \"telemeter-client-576b4d6588-xn2zj\" (UID: \"87e464d4-7566-453e-bc71-f3a661e74c9d\") " pod="openshift-monitoring/telemeter-client-576b4d6588-xn2zj" Apr 24 22:33:07.994173 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.993988 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/87e464d4-7566-453e-bc71-f3a661e74c9d-telemeter-client-tls\") pod \"telemeter-client-576b4d6588-xn2zj\" (UID: \"87e464d4-7566-453e-bc71-f3a661e74c9d\") " pod="openshift-monitoring/telemeter-client-576b4d6588-xn2zj" Apr 24 22:33:07.994703 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.994674 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87e464d4-7566-453e-bc71-f3a661e74c9d-metrics-client-ca\") pod \"telemeter-client-576b4d6588-xn2zj\" (UID: \"87e464d4-7566-453e-bc71-f3a661e74c9d\") " pod="openshift-monitoring/telemeter-client-576b4d6588-xn2zj" Apr 24 22:33:07.995484 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.995414 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87e464d4-7566-453e-bc71-f3a661e74c9d-telemeter-trusted-ca-bundle\") pod \"telemeter-client-576b4d6588-xn2zj\" (UID: \"87e464d4-7566-453e-bc71-f3a661e74c9d\") " pod="openshift-monitoring/telemeter-client-576b4d6588-xn2zj" Apr 24 22:33:07.995697 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.995503 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87e464d4-7566-453e-bc71-f3a661e74c9d-serving-certs-ca-bundle\") pod \"telemeter-client-576b4d6588-xn2zj\" (UID: \"87e464d4-7566-453e-bc71-f3a661e74c9d\") " pod="openshift-monitoring/telemeter-client-576b4d6588-xn2zj" Apr 24 22:33:07.997613 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.997591 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/87e464d4-7566-453e-bc71-f3a661e74c9d-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-576b4d6588-xn2zj\" (UID: \"87e464d4-7566-453e-bc71-f3a661e74c9d\") " pod="openshift-monitoring/telemeter-client-576b4d6588-xn2zj" Apr 24 22:33:07.997800 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.997656 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/87e464d4-7566-453e-bc71-f3a661e74c9d-telemeter-client-tls\") pod \"telemeter-client-576b4d6588-xn2zj\" (UID: 
\"87e464d4-7566-453e-bc71-f3a661e74c9d\") " pod="openshift-monitoring/telemeter-client-576b4d6588-xn2zj" Apr 24 22:33:07.998059 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.997840 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/87e464d4-7566-453e-bc71-f3a661e74c9d-federate-client-tls\") pod \"telemeter-client-576b4d6588-xn2zj\" (UID: \"87e464d4-7566-453e-bc71-f3a661e74c9d\") " pod="openshift-monitoring/telemeter-client-576b4d6588-xn2zj" Apr 24 22:33:07.998496 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:07.998474 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/87e464d4-7566-453e-bc71-f3a661e74c9d-secret-telemeter-client\") pod \"telemeter-client-576b4d6588-xn2zj\" (UID: \"87e464d4-7566-453e-bc71-f3a661e74c9d\") " pod="openshift-monitoring/telemeter-client-576b4d6588-xn2zj" Apr 24 22:33:08.003404 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:08.003371 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pllkr\" (UniqueName: \"kubernetes.io/projected/87e464d4-7566-453e-bc71-f3a661e74c9d-kube-api-access-pllkr\") pod \"telemeter-client-576b4d6588-xn2zj\" (UID: \"87e464d4-7566-453e-bc71-f3a661e74c9d\") " pod="openshift-monitoring/telemeter-client-576b4d6588-xn2zj" Apr 24 22:33:08.120004 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:08.119954 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-576b4d6588-xn2zj" Apr 24 22:33:08.271660 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:08.271621 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-576b4d6588-xn2zj"] Apr 24 22:33:08.274087 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:33:08.274041 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87e464d4_7566_453e_bc71_f3a661e74c9d.slice/crio-b7679c4753f57f41482c20a9b1dbbb717aa59761a20127fdeba79d1bd30d8888 WatchSource:0}: Error finding container b7679c4753f57f41482c20a9b1dbbb717aa59761a20127fdeba79d1bd30d8888: Status 404 returned error can't find the container with id b7679c4753f57f41482c20a9b1dbbb717aa59761a20127fdeba79d1bd30d8888 Apr 24 22:33:08.435130 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:08.435069 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-576b4d6588-xn2zj" event={"ID":"87e464d4-7566-453e-bc71-f3a661e74c9d","Type":"ContainerStarted","Data":"b7679c4753f57f41482c20a9b1dbbb717aa59761a20127fdeba79d1bd30d8888"} Apr 24 22:33:08.437371 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:08.437342 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-k5srd" event={"ID":"7d231644-f466-44c6-8279-d34341dcfa89","Type":"ContainerStarted","Data":"48e663719e9501686278d1ec3ea7533256126f925587019d300cb807bc96acc3"} Apr 24 22:33:08.438903 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:08.438870 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-8d5c47b75-qtv9m" event={"ID":"c25776a2-f338-47e3-a66f-0dbbe5007841","Type":"ContainerStarted","Data":"29de72aa0740a3fc6159fd96b39e923c2ba2fcf122f72ec9930d5cc32e557f15"} Apr 24 22:33:08.848204 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:08.848170 2574 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3150c082-9c90-4dd3-ad49-9ddb41172e83" path="/var/lib/kubelet/pods/3150c082-9c90-4dd3-ad49-9ddb41172e83/volumes" Apr 24 22:33:08.877353 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:08.877321 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 22:33:08.894366 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:08.893738 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:08.895336 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:08.895287 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 22:33:08.900034 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:08.899849 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 24 22:33:08.900034 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:08.899869 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 24 22:33:08.900327 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:08.900147 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 24 22:33:08.900327 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:08.900295 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 24 22:33:08.901023 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:08.900562 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-v95hq\"" Apr 24 22:33:08.901023 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:08.900792 2574 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 24 22:33:08.901709 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:08.901690 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 24 22:33:08.901941 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:08.901925 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-amkj46n7vn01c\"" Apr 24 22:33:08.902181 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:08.902162 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 24 22:33:08.902361 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:08.902346 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 24 22:33:08.902445 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:08.902428 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 24 22:33:08.902577 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:08.902559 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 24 22:33:08.904259 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:08.903184 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 24 22:33:08.917637 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:08.916350 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 24 22:33:09.004501 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.004465 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c74ab500-f85a-4bfe-8de4-08bb4710461e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.004501 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.004501 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c74ab500-f85a-4bfe-8de4-08bb4710461e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.004683 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.004526 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c74ab500-f85a-4bfe-8de4-08bb4710461e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.004683 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.004578 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c74ab500-f85a-4bfe-8de4-08bb4710461e-config-out\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.004683 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.004652 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c74ab500-f85a-4bfe-8de4-08bb4710461e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.004683 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.004669 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c74ab500-f85a-4bfe-8de4-08bb4710461e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.004817 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.004709 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c74ab500-f85a-4bfe-8de4-08bb4710461e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.004817 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.004726 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c74ab500-f85a-4bfe-8de4-08bb4710461e-config\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.004817 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.004744 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c74ab500-f85a-4bfe-8de4-08bb4710461e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.004817 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.004776 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c74ab500-f85a-4bfe-8de4-08bb4710461e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.004817 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.004810 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c74ab500-f85a-4bfe-8de4-08bb4710461e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.004985 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.004829 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c74ab500-f85a-4bfe-8de4-08bb4710461e-web-config\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.004985 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.004847 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c74ab500-f85a-4bfe-8de4-08bb4710461e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.004985 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.004864 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvnv4\" (UniqueName: \"kubernetes.io/projected/c74ab500-f85a-4bfe-8de4-08bb4710461e-kube-api-access-jvnv4\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.004985 ip-10-0-133-161 
kubenswrapper[2574]: I0424 22:33:09.004881 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c74ab500-f85a-4bfe-8de4-08bb4710461e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.004985 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.004895 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c74ab500-f85a-4bfe-8de4-08bb4710461e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.004985 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.004951 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c74ab500-f85a-4bfe-8de4-08bb4710461e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.005204 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.004993 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c74ab500-f85a-4bfe-8de4-08bb4710461e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.106590 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.106555 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c74ab500-f85a-4bfe-8de4-08bb4710461e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: 
\"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.106751 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.106612 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c74ab500-f85a-4bfe-8de4-08bb4710461e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.106751 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.106640 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c74ab500-f85a-4bfe-8de4-08bb4710461e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.106751 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.106667 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c74ab500-f85a-4bfe-8de4-08bb4710461e-web-config\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.106751 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.106701 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c74ab500-f85a-4bfe-8de4-08bb4710461e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.106751 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.106733 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jvnv4\" (UniqueName: 
\"kubernetes.io/projected/c74ab500-f85a-4bfe-8de4-08bb4710461e-kube-api-access-jvnv4\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.107061 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.106762 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c74ab500-f85a-4bfe-8de4-08bb4710461e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.107061 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.106785 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c74ab500-f85a-4bfe-8de4-08bb4710461e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.107061 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.106846 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c74ab500-f85a-4bfe-8de4-08bb4710461e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.107061 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.106904 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c74ab500-f85a-4bfe-8de4-08bb4710461e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.107061 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.106941 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c74ab500-f85a-4bfe-8de4-08bb4710461e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.107061 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.106993 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c74ab500-f85a-4bfe-8de4-08bb4710461e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.107061 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.107020 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c74ab500-f85a-4bfe-8de4-08bb4710461e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.107061 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.107036 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c74ab500-f85a-4bfe-8de4-08bb4710461e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.107443 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.107088 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c74ab500-f85a-4bfe-8de4-08bb4710461e-config-out\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.107443 ip-10-0-133-161 kubenswrapper[2574]: I0424 
22:33:09.107147 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c74ab500-f85a-4bfe-8de4-08bb4710461e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.107443 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.107179 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c74ab500-f85a-4bfe-8de4-08bb4710461e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.107443 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.107222 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c74ab500-f85a-4bfe-8de4-08bb4710461e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.107443 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.107247 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c74ab500-f85a-4bfe-8de4-08bb4710461e-config\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.108306 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.107984 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c74ab500-f85a-4bfe-8de4-08bb4710461e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.108306 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.108000 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c74ab500-f85a-4bfe-8de4-08bb4710461e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.110290 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.110011 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c74ab500-f85a-4bfe-8de4-08bb4710461e-web-config\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.111117 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.111093 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c74ab500-f85a-4bfe-8de4-08bb4710461e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.112065 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.111263 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c74ab500-f85a-4bfe-8de4-08bb4710461e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.112065 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.111776 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c74ab500-f85a-4bfe-8de4-08bb4710461e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: 
\"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.112065 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.111910 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c74ab500-f85a-4bfe-8de4-08bb4710461e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.112065 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.111942 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c74ab500-f85a-4bfe-8de4-08bb4710461e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.112330 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.112180 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c74ab500-f85a-4bfe-8de4-08bb4710461e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.112330 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.112304 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c74ab500-f85a-4bfe-8de4-08bb4710461e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.113424 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.113389 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/c74ab500-f85a-4bfe-8de4-08bb4710461e-config\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.114203 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.114145 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c74ab500-f85a-4bfe-8de4-08bb4710461e-config-out\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.114203 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.114157 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c74ab500-f85a-4bfe-8de4-08bb4710461e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.115588 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.115552 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c74ab500-f85a-4bfe-8de4-08bb4710461e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.116064 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.116027 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c74ab500-f85a-4bfe-8de4-08bb4710461e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.117314 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.117287 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/c74ab500-f85a-4bfe-8de4-08bb4710461e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.119052 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.119029 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvnv4\" (UniqueName: \"kubernetes.io/projected/c74ab500-f85a-4bfe-8de4-08bb4710461e-kube-api-access-jvnv4\") pod \"prometheus-k8s-0\" (UID: \"c74ab500-f85a-4bfe-8de4-08bb4710461e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.212524 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.212491 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:09.378111 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.378035 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-hcwpm" Apr 24 22:33:09.380056 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.380033 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-hcwpm" Apr 24 22:33:09.983439 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:09.983383 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 22:33:09.986887 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:33:09.986848 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc74ab500_f85a_4bfe_8de4_08bb4710461e.slice/crio-d1e736a7bddb9c29bd0bf253d9d3e346d296086ae80af83ab51cd46b5d446e54 WatchSource:0}: Error finding container d1e736a7bddb9c29bd0bf253d9d3e346d296086ae80af83ab51cd46b5d446e54: Status 404 returned error can't find the container with id d1e736a7bddb9c29bd0bf253d9d3e346d296086ae80af83ab51cd46b5d446e54 Apr 24 22:33:10.446365 ip-10-0-133-161 
kubenswrapper[2574]: I0424 22:33:10.446328 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-k5srd" event={"ID":"7d231644-f466-44c6-8279-d34341dcfa89","Type":"ContainerStarted","Data":"b0da618cfc80e541297d337e36fb4d89bf48d21bd821aef3676902085ef4406d"} Apr 24 22:33:10.446603 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:10.446582 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-k5srd" Apr 24 22:33:10.447880 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:10.447854 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-8d5c47b75-qtv9m" event={"ID":"c25776a2-f338-47e3-a66f-0dbbe5007841","Type":"ContainerStarted","Data":"ede55e5cd4d3fa42b908caed93314819dc8be2a469197c7623737d237db937d7"} Apr 24 22:33:10.449142 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:10.449109 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c74ab500-f85a-4bfe-8de4-08bb4710461e","Type":"ContainerStarted","Data":"d1e736a7bddb9c29bd0bf253d9d3e346d296086ae80af83ab51cd46b5d446e54"} Apr 24 22:33:10.452080 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:10.452062 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-k5srd" Apr 24 22:33:10.462838 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:10.462791 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-k5srd" podStartSLOduration=1.4507025150000001 podStartE2EDuration="3.462778123s" podCreationTimestamp="2026-04-24 22:33:07 +0000 UTC" firstStartedPulling="2026-04-24 22:33:07.83243109 +0000 UTC m=+197.556023638" lastFinishedPulling="2026-04-24 22:33:09.844506684 +0000 UTC m=+199.568099246" observedRunningTime="2026-04-24 22:33:10.462447225 +0000 UTC 
m=+200.186039798" watchObservedRunningTime="2026-04-24 22:33:10.462778123 +0000 UTC m=+200.186370726" Apr 24 22:33:10.480909 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:10.480858 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-8d5c47b75-qtv9m" podStartSLOduration=2.11410008 podStartE2EDuration="4.48084208s" podCreationTimestamp="2026-04-24 22:33:06 +0000 UTC" firstStartedPulling="2026-04-24 22:33:07.473811383 +0000 UTC m=+197.197403932" lastFinishedPulling="2026-04-24 22:33:09.840553363 +0000 UTC m=+199.564145932" observedRunningTime="2026-04-24 22:33:10.478885753 +0000 UTC m=+200.202478323" watchObservedRunningTime="2026-04-24 22:33:10.48084208 +0000 UTC m=+200.204434651" Apr 24 22:33:11.026404 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:11.026375 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-58bf976dd7-qz957"] Apr 24 22:33:11.054832 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:11.054796 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-58bf976dd7-qz957"] Apr 24 22:33:11.054988 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:11.054938 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-58bf976dd7-qz957" Apr 24 22:33:11.127780 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:11.127751 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-console-serving-cert\") pod \"console-58bf976dd7-qz957\" (UID: \"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa\") " pod="openshift-console/console-58bf976dd7-qz957" Apr 24 22:33:11.127930 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:11.127830 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-console-oauth-config\") pod \"console-58bf976dd7-qz957\" (UID: \"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa\") " pod="openshift-console/console-58bf976dd7-qz957" Apr 24 22:33:11.127930 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:11.127890 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-trusted-ca-bundle\") pod \"console-58bf976dd7-qz957\" (UID: \"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa\") " pod="openshift-console/console-58bf976dd7-qz957" Apr 24 22:33:11.128055 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:11.127945 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-service-ca\") pod \"console-58bf976dd7-qz957\" (UID: \"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa\") " pod="openshift-console/console-58bf976dd7-qz957" Apr 24 22:33:11.128055 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:11.128002 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-oauth-serving-cert\") pod \"console-58bf976dd7-qz957\" (UID: \"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa\") " pod="openshift-console/console-58bf976dd7-qz957" Apr 24 22:33:11.128055 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:11.128024 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-console-config\") pod \"console-58bf976dd7-qz957\" (UID: \"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa\") " pod="openshift-console/console-58bf976dd7-qz957" Apr 24 22:33:11.128055 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:11.128039 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rghrc\" (UniqueName: \"kubernetes.io/projected/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-kube-api-access-rghrc\") pod \"console-58bf976dd7-qz957\" (UID: \"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa\") " pod="openshift-console/console-58bf976dd7-qz957" Apr 24 22:33:11.228547 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:11.228469 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-trusted-ca-bundle\") pod \"console-58bf976dd7-qz957\" (UID: \"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa\") " pod="openshift-console/console-58bf976dd7-qz957" Apr 24 22:33:11.228547 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:11.228520 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-service-ca\") pod \"console-58bf976dd7-qz957\" (UID: \"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa\") " pod="openshift-console/console-58bf976dd7-qz957" Apr 24 22:33:11.228547 ip-10-0-133-161 kubenswrapper[2574]: 
I0424 22:33:11.228544 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-oauth-serving-cert\") pod \"console-58bf976dd7-qz957\" (UID: \"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa\") " pod="openshift-console/console-58bf976dd7-qz957" Apr 24 22:33:11.228820 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:11.228572 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-console-config\") pod \"console-58bf976dd7-qz957\" (UID: \"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa\") " pod="openshift-console/console-58bf976dd7-qz957" Apr 24 22:33:11.228820 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:11.228597 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rghrc\" (UniqueName: \"kubernetes.io/projected/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-kube-api-access-rghrc\") pod \"console-58bf976dd7-qz957\" (UID: \"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa\") " pod="openshift-console/console-58bf976dd7-qz957" Apr 24 22:33:11.228820 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:11.228664 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-console-serving-cert\") pod \"console-58bf976dd7-qz957\" (UID: \"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa\") " pod="openshift-console/console-58bf976dd7-qz957" Apr 24 22:33:11.228820 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:11.228729 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-console-oauth-config\") pod \"console-58bf976dd7-qz957\" (UID: \"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa\") " 
pod="openshift-console/console-58bf976dd7-qz957" Apr 24 22:33:11.229519 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:11.229487 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-trusted-ca-bundle\") pod \"console-58bf976dd7-qz957\" (UID: \"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa\") " pod="openshift-console/console-58bf976dd7-qz957" Apr 24 22:33:11.229641 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:11.229573 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-oauth-serving-cert\") pod \"console-58bf976dd7-qz957\" (UID: \"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa\") " pod="openshift-console/console-58bf976dd7-qz957" Apr 24 22:33:11.229704 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:11.229633 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-console-config\") pod \"console-58bf976dd7-qz957\" (UID: \"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa\") " pod="openshift-console/console-58bf976dd7-qz957" Apr 24 22:33:11.229859 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:11.229839 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-service-ca\") pod \"console-58bf976dd7-qz957\" (UID: \"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa\") " pod="openshift-console/console-58bf976dd7-qz957" Apr 24 22:33:11.231663 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:11.231642 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-console-oauth-config\") pod \"console-58bf976dd7-qz957\" (UID: 
\"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa\") " pod="openshift-console/console-58bf976dd7-qz957" Apr 24 22:33:11.231782 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:11.231764 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-console-serving-cert\") pod \"console-58bf976dd7-qz957\" (UID: \"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa\") " pod="openshift-console/console-58bf976dd7-qz957" Apr 24 22:33:11.237662 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:11.237630 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rghrc\" (UniqueName: \"kubernetes.io/projected/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-kube-api-access-rghrc\") pod \"console-58bf976dd7-qz957\" (UID: \"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa\") " pod="openshift-console/console-58bf976dd7-qz957" Apr 24 22:33:11.366685 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:11.366649 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-58bf976dd7-qz957"
Apr 24 22:33:11.453909 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:11.453878 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-576b4d6588-xn2zj" event={"ID":"87e464d4-7566-453e-bc71-f3a661e74c9d","Type":"ContainerStarted","Data":"6b84ad2b1acf90c25df3d5e22be031681544d19ae09caf7c3b383bebd50c23b8"}
Apr 24 22:33:11.629815 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:11.629783 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-58bf976dd7-qz957"]
Apr 24 22:33:11.632574 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:33:11.632541 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f31d010_df2a_4a29_bfd1_df8ccfd31ffa.slice/crio-e73cbaeb56cce53eb75409327e5a93192c9e6965f4cf0a7200a420c327e840e5 WatchSource:0}: Error finding container e73cbaeb56cce53eb75409327e5a93192c9e6965f4cf0a7200a420c327e840e5: Status 404 returned error can't find the container with id e73cbaeb56cce53eb75409327e5a93192c9e6965f4cf0a7200a420c327e840e5
Apr 24 22:33:12.458281 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:12.458190 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-576b4d6588-xn2zj" event={"ID":"87e464d4-7566-453e-bc71-f3a661e74c9d","Type":"ContainerStarted","Data":"9dca88df5458a166aa316c6719b090dba30c1bd953b87aab77997b51fceed9dc"}
Apr 24 22:33:12.458281 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:12.458232 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-576b4d6588-xn2zj" event={"ID":"87e464d4-7566-453e-bc71-f3a661e74c9d","Type":"ContainerStarted","Data":"9b53aed75a56a8048604ce326d6f74146cc314218b7f8d2d6d892286d448d58b"}
Apr 24 22:33:12.459471 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:12.459448 2574 generic.go:358] "Generic (PLEG): container finished" podID="c74ab500-f85a-4bfe-8de4-08bb4710461e" containerID="987b5895c5553d0c686dad78b241a19d76bda6974b08388c1d49d433a793e042" exitCode=0
Apr 24 22:33:12.459575 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:12.459527 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c74ab500-f85a-4bfe-8de4-08bb4710461e","Type":"ContainerDied","Data":"987b5895c5553d0c686dad78b241a19d76bda6974b08388c1d49d433a793e042"}
Apr 24 22:33:12.460900 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:12.460878 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58bf976dd7-qz957" event={"ID":"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa","Type":"ContainerStarted","Data":"eea31cc608447fcf06697fb817520027a1118b7bf80f07c28692c157905b95a3"}
Apr 24 22:33:12.460900 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:12.460906 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58bf976dd7-qz957" event={"ID":"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa","Type":"ContainerStarted","Data":"e73cbaeb56cce53eb75409327e5a93192c9e6965f4cf0a7200a420c327e840e5"}
Apr 24 22:33:12.481301 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:12.481230 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-576b4d6588-xn2zj" podStartSLOduration=1.7772557020000002 podStartE2EDuration="5.481217965s" podCreationTimestamp="2026-04-24 22:33:07 +0000 UTC" firstStartedPulling="2026-04-24 22:33:08.276525732 +0000 UTC m=+198.000118280" lastFinishedPulling="2026-04-24 22:33:11.980487995 +0000 UTC m=+201.704080543" observedRunningTime="2026-04-24 22:33:12.480155569 +0000 UTC m=+202.203748136" watchObservedRunningTime="2026-04-24 22:33:12.481217965 +0000 UTC m=+202.204810553"
Apr 24 22:33:12.524783 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:12.524730 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-58bf976dd7-qz957" podStartSLOduration=1.524716358 podStartE2EDuration="1.524716358s" podCreationTimestamp="2026-04-24 22:33:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:33:12.523423938 +0000 UTC m=+202.247016512" watchObservedRunningTime="2026-04-24 22:33:12.524716358 +0000 UTC m=+202.248308928"
Apr 24 22:33:14.020297 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:14.020258 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-58bf976dd7-qz957"]
Apr 24 22:33:14.051312 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:14.051279 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6b57788ffb-vzfrb"]
Apr 24 22:33:14.072219 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:14.072182 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b57788ffb-vzfrb"]
Apr 24 22:33:14.072383 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:14.072367 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-console/console-6b57788ffb-vzfrb"
Apr 24 22:33:14.159381 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:14.159347 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/55686459-acd5-4429-9b1f-2db2bbc3e5a2-oauth-serving-cert\") pod \"console-6b57788ffb-vzfrb\" (UID: \"55686459-acd5-4429-9b1f-2db2bbc3e5a2\") " pod="openshift-console/console-6b57788ffb-vzfrb"
Apr 24 22:33:14.159381 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:14.159376 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/55686459-acd5-4429-9b1f-2db2bbc3e5a2-console-config\") pod \"console-6b57788ffb-vzfrb\" (UID: \"55686459-acd5-4429-9b1f-2db2bbc3e5a2\") " pod="openshift-console/console-6b57788ffb-vzfrb"
Apr 24 22:33:14.159546 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:14.159399 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55686459-acd5-4429-9b1f-2db2bbc3e5a2-trusted-ca-bundle\") pod \"console-6b57788ffb-vzfrb\" (UID: \"55686459-acd5-4429-9b1f-2db2bbc3e5a2\") " pod="openshift-console/console-6b57788ffb-vzfrb"
Apr 24 22:33:14.159546 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:14.159483 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/55686459-acd5-4429-9b1f-2db2bbc3e5a2-console-serving-cert\") pod \"console-6b57788ffb-vzfrb\" (UID: \"55686459-acd5-4429-9b1f-2db2bbc3e5a2\") " pod="openshift-console/console-6b57788ffb-vzfrb"
Apr 24 22:33:14.159546 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:14.159513 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2vjf\" (UniqueName: \"kubernetes.io/projected/55686459-acd5-4429-9b1f-2db2bbc3e5a2-kube-api-access-j2vjf\") pod \"console-6b57788ffb-vzfrb\" (UID: \"55686459-acd5-4429-9b1f-2db2bbc3e5a2\") " pod="openshift-console/console-6b57788ffb-vzfrb"
Apr 24 22:33:14.159679 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:14.159584 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55686459-acd5-4429-9b1f-2db2bbc3e5a2-service-ca\") pod \"console-6b57788ffb-vzfrb\" (UID: \"55686459-acd5-4429-9b1f-2db2bbc3e5a2\") " pod="openshift-console/console-6b57788ffb-vzfrb"
Apr 24 22:33:14.159679 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:14.159656 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/55686459-acd5-4429-9b1f-2db2bbc3e5a2-console-oauth-config\") pod \"console-6b57788ffb-vzfrb\" (UID: \"55686459-acd5-4429-9b1f-2db2bbc3e5a2\") " pod="openshift-console/console-6b57788ffb-vzfrb"
Apr 24 22:33:14.261093 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:14.261062 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/55686459-acd5-4429-9b1f-2db2bbc3e5a2-console-serving-cert\") pod \"console-6b57788ffb-vzfrb\" (UID: \"55686459-acd5-4429-9b1f-2db2bbc3e5a2\") " pod="openshift-console/console-6b57788ffb-vzfrb"
Apr 24 22:33:14.261266 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:14.261105 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j2vjf\" (UniqueName: \"kubernetes.io/projected/55686459-acd5-4429-9b1f-2db2bbc3e5a2-kube-api-access-j2vjf\") pod \"console-6b57788ffb-vzfrb\" (UID: \"55686459-acd5-4429-9b1f-2db2bbc3e5a2\") " pod="openshift-console/console-6b57788ffb-vzfrb"
Apr 24 22:33:14.261266 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:14.261163 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55686459-acd5-4429-9b1f-2db2bbc3e5a2-service-ca\") pod \"console-6b57788ffb-vzfrb\" (UID: \"55686459-acd5-4429-9b1f-2db2bbc3e5a2\") " pod="openshift-console/console-6b57788ffb-vzfrb"
Apr 24 22:33:14.261266 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:14.261195 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/55686459-acd5-4429-9b1f-2db2bbc3e5a2-console-oauth-config\") pod \"console-6b57788ffb-vzfrb\" (UID: \"55686459-acd5-4429-9b1f-2db2bbc3e5a2\") " pod="openshift-console/console-6b57788ffb-vzfrb"
Apr 24 22:33:14.261266 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:14.261261 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/55686459-acd5-4429-9b1f-2db2bbc3e5a2-oauth-serving-cert\") pod \"console-6b57788ffb-vzfrb\" (UID: \"55686459-acd5-4429-9b1f-2db2bbc3e5a2\") " pod="openshift-console/console-6b57788ffb-vzfrb"
Apr 24 22:33:14.261474 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:14.261289 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/55686459-acd5-4429-9b1f-2db2bbc3e5a2-console-config\") pod \"console-6b57788ffb-vzfrb\" (UID: \"55686459-acd5-4429-9b1f-2db2bbc3e5a2\") " pod="openshift-console/console-6b57788ffb-vzfrb"
Apr 24 22:33:14.261474 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:14.261329 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55686459-acd5-4429-9b1f-2db2bbc3e5a2-trusted-ca-bundle\") pod \"console-6b57788ffb-vzfrb\" (UID: \"55686459-acd5-4429-9b1f-2db2bbc3e5a2\") "
pod="openshift-console/console-6b57788ffb-vzfrb"
Apr 24 22:33:14.262059 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:14.261993 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55686459-acd5-4429-9b1f-2db2bbc3e5a2-service-ca\") pod \"console-6b57788ffb-vzfrb\" (UID: \"55686459-acd5-4429-9b1f-2db2bbc3e5a2\") " pod="openshift-console/console-6b57788ffb-vzfrb"
Apr 24 22:33:14.262059 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:14.262021 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/55686459-acd5-4429-9b1f-2db2bbc3e5a2-oauth-serving-cert\") pod \"console-6b57788ffb-vzfrb\" (UID: \"55686459-acd5-4429-9b1f-2db2bbc3e5a2\") " pod="openshift-console/console-6b57788ffb-vzfrb"
Apr 24 22:33:14.262310 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:14.262257 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/55686459-acd5-4429-9b1f-2db2bbc3e5a2-console-config\") pod \"console-6b57788ffb-vzfrb\" (UID: \"55686459-acd5-4429-9b1f-2db2bbc3e5a2\") " pod="openshift-console/console-6b57788ffb-vzfrb"
Apr 24 22:33:14.262310 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:14.262296 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55686459-acd5-4429-9b1f-2db2bbc3e5a2-trusted-ca-bundle\") pod \"console-6b57788ffb-vzfrb\" (UID: \"55686459-acd5-4429-9b1f-2db2bbc3e5a2\") " pod="openshift-console/console-6b57788ffb-vzfrb"
Apr 24 22:33:14.263760 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:14.263737 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/55686459-acd5-4429-9b1f-2db2bbc3e5a2-console-serving-cert\") pod \"console-6b57788ffb-vzfrb\" (UID: \"55686459-acd5-4429-9b1f-2db2bbc3e5a2\") " pod="openshift-console/console-6b57788ffb-vzfrb"
Apr 24 22:33:14.263879 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:14.263782 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/55686459-acd5-4429-9b1f-2db2bbc3e5a2-console-oauth-config\") pod \"console-6b57788ffb-vzfrb\" (UID: \"55686459-acd5-4429-9b1f-2db2bbc3e5a2\") " pod="openshift-console/console-6b57788ffb-vzfrb"
Apr 24 22:33:14.269157 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:14.269137 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2vjf\" (UniqueName: \"kubernetes.io/projected/55686459-acd5-4429-9b1f-2db2bbc3e5a2-kube-api-access-j2vjf\") pod \"console-6b57788ffb-vzfrb\" (UID: \"55686459-acd5-4429-9b1f-2db2bbc3e5a2\") " pod="openshift-console/console-6b57788ffb-vzfrb"
Apr 24 22:33:14.385137 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:14.385104 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-console/console-6b57788ffb-vzfrb"
Apr 24 22:33:15.359509 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:15.359484 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b57788ffb-vzfrb"]
Apr 24 22:33:15.361934 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:33:15.361904 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55686459_acd5_4429_9b1f_2db2bbc3e5a2.slice/crio-ca4615e5073be5602b9e6aa0b0f3da5667c7034778b6f691c99a637b75b548a2 WatchSource:0}: Error finding container ca4615e5073be5602b9e6aa0b0f3da5667c7034778b6f691c99a637b75b548a2: Status 404 returned error can't find the container with id ca4615e5073be5602b9e6aa0b0f3da5667c7034778b6f691c99a637b75b548a2
Apr 24 22:33:15.471310 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:15.471278 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b57788ffb-vzfrb" event={"ID":"55686459-acd5-4429-9b1f-2db2bbc3e5a2","Type":"ContainerStarted","Data":"ca4615e5073be5602b9e6aa0b0f3da5667c7034778b6f691c99a637b75b548a2"}
Apr 24 22:33:16.480085 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:16.480025 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b57788ffb-vzfrb" event={"ID":"55686459-acd5-4429-9b1f-2db2bbc3e5a2","Type":"ContainerStarted","Data":"4a02805e183a7f5a3809db4da03d3fa723e0b8c534c25bd0f9db364d71694a6a"}
Apr 24 22:33:16.482266 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:16.482241 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c74ab500-f85a-4bfe-8de4-08bb4710461e","Type":"ContainerStarted","Data":"ef21c2984dd09f55e5c5e145029fb97c6865b8a0527d59f3cfa32788ae8aa586"}
Apr 24 22:33:16.482266 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:16.482270 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c74ab500-f85a-4bfe-8de4-08bb4710461e","Type":"ContainerStarted","Data":"b09bfa3b08c8a427a49fdb002d1738a5ff69e87c0ea514780b2bd924f6c419a2"}
Apr 24 22:33:16.500209 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:16.500156 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6b57788ffb-vzfrb" podStartSLOduration=2.500139484 podStartE2EDuration="2.500139484s" podCreationTimestamp="2026-04-24 22:33:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:33:16.498491273 +0000 UTC m=+206.222083844" watchObservedRunningTime="2026-04-24 22:33:16.500139484 +0000 UTC m=+206.223732055"
Apr 24 22:33:18.493316 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:18.493284 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c74ab500-f85a-4bfe-8de4-08bb4710461e","Type":"ContainerStarted","Data":"76bb991737f6929154ef877173a0d08a48ee7fffbd029d87aeb9be116e072d35"}
Apr 24 22:33:18.493682 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:18.493326 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c74ab500-f85a-4bfe-8de4-08bb4710461e","Type":"ContainerStarted","Data":"c74c6aeb6a4c8ae0272ea5be5d7ac28af4ee3c1a54539690b8a5605bd1a1bc33"}
Apr 24 22:33:18.493682 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:18.493339 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c74ab500-f85a-4bfe-8de4-08bb4710461e","Type":"ContainerStarted","Data":"e22b3ed9d1fc8b10c78468b2eaee7a0407c681edf001d50a7561f4588b03fe85"}
Apr 24 22:33:19.499660 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:19.499624 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c74ab500-f85a-4bfe-8de4-08bb4710461e","Type":"ContainerStarted","Data":"d8b47278b7be3125c90905fe42b08b60ea4bf92adca4b577f6a8348b1e6bc3a3"}
Apr 24 22:33:19.529156 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:19.529113 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.419061481 podStartE2EDuration="11.529097987s" podCreationTimestamp="2026-04-24 22:33:08 +0000 UTC" firstStartedPulling="2026-04-24 22:33:09.989915899 +0000 UTC m=+199.713508461" lastFinishedPulling="2026-04-24 22:33:18.099952403 +0000 UTC m=+207.823544967" observedRunningTime="2026-04-24 22:33:19.526750343 +0000 UTC m=+209.250342953" watchObservedRunningTime="2026-04-24 22:33:19.529097987 +0000 UTC m=+209.252690557"
Apr 24 22:33:21.367193 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:21.367157 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-58bf976dd7-qz957"
Apr 24 22:33:24.213107 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:24.213074 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:33:24.385978 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:24.385928 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6b57788ffb-vzfrb"
Apr 24 22:33:24.386130 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:24.385996 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6b57788ffb-vzfrb"
Apr 24 22:33:24.390764 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:24.390746 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6b57788ffb-vzfrb"
Apr 24 22:33:24.523979 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:24.523886 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready"
pod="openshift-console/console-6b57788ffb-vzfrb"
Apr 24 22:33:24.569104 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:24.569078 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7c9d9ddd69-hxpjz"]
Apr 24 22:33:27.310612 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:27.310580 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-8d5c47b75-qtv9m"
Apr 24 22:33:27.310612 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:27.310621 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-8d5c47b75-qtv9m"
Apr 24 22:33:29.536551 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:29.536520 2574 generic.go:358] "Generic (PLEG): container finished" podID="e0595875-db43-466a-aa45-15f3138253c4" containerID="22571ce282aef417bb42d60157e895c2d6ed6345e3829e28da29299f891e39ab" exitCode=0
Apr 24 22:33:29.536929 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:29.536593 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-m7xn9" event={"ID":"e0595875-db43-466a-aa45-15f3138253c4","Type":"ContainerDied","Data":"22571ce282aef417bb42d60157e895c2d6ed6345e3829e28da29299f891e39ab"}
Apr 24 22:33:29.536991 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:29.536945 2574 scope.go:117] "RemoveContainer" containerID="22571ce282aef417bb42d60157e895c2d6ed6345e3829e28da29299f891e39ab"
Apr 24 22:33:30.542043 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:30.542005 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-m7xn9" event={"ID":"e0595875-db43-466a-aa45-15f3138253c4","Type":"ContainerStarted","Data":"35b937ccafacdd8ff612f0fedcce0fcc42bda86f9c2d01761f4b071650d4350d"}
Apr 24 22:33:31.516453 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:31.516396 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-797d95865c-xk2x7" podUID="67a49298-1d9a-4ed5-ae8c-f018ab319d9e" containerName="console" containerID="cri-o://69f54502508ad25b83e6fec79c8b1e3d75ff561d02acedd8dcf8f423f10e6b8c" gracePeriod=15
Apr 24 22:33:31.803589 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:31.803559 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-797d95865c-xk2x7_67a49298-1d9a-4ed5-ae8c-f018ab319d9e/console/0.log"
Apr 24 22:33:31.803865 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:31.803625 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-797d95865c-xk2x7"
Apr 24 22:33:31.921537 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:31.921502 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/67a49298-1d9a-4ed5-ae8c-f018ab319d9e-console-config\") pod \"67a49298-1d9a-4ed5-ae8c-f018ab319d9e\" (UID: \"67a49298-1d9a-4ed5-ae8c-f018ab319d9e\") "
Apr 24 22:33:31.921706 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:31.921570 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/67a49298-1d9a-4ed5-ae8c-f018ab319d9e-console-serving-cert\") pod \"67a49298-1d9a-4ed5-ae8c-f018ab319d9e\" (UID: \"67a49298-1d9a-4ed5-ae8c-f018ab319d9e\") "
Apr 24 22:33:31.921706 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:31.921595 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/67a49298-1d9a-4ed5-ae8c-f018ab319d9e-console-oauth-config\") pod \"67a49298-1d9a-4ed5-ae8c-f018ab319d9e\" (UID: \"67a49298-1d9a-4ed5-ae8c-f018ab319d9e\") "
Apr 24 22:33:31.921706 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:31.921654 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg86n\" (UniqueName: \"kubernetes.io/projected/67a49298-1d9a-4ed5-ae8c-f018ab319d9e-kube-api-access-qg86n\") pod \"67a49298-1d9a-4ed5-ae8c-f018ab319d9e\" (UID: \"67a49298-1d9a-4ed5-ae8c-f018ab319d9e\") "
Apr 24 22:33:31.921706 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:31.921675 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/67a49298-1d9a-4ed5-ae8c-f018ab319d9e-oauth-serving-cert\") pod \"67a49298-1d9a-4ed5-ae8c-f018ab319d9e\" (UID: \"67a49298-1d9a-4ed5-ae8c-f018ab319d9e\") "
Apr 24 22:33:31.921706 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:31.921697 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/67a49298-1d9a-4ed5-ae8c-f018ab319d9e-service-ca\") pod \"67a49298-1d9a-4ed5-ae8c-f018ab319d9e\" (UID: \"67a49298-1d9a-4ed5-ae8c-f018ab319d9e\") "
Apr 24 22:33:31.922125 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:31.922092 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67a49298-1d9a-4ed5-ae8c-f018ab319d9e-console-config" (OuterVolumeSpecName: "console-config") pod "67a49298-1d9a-4ed5-ae8c-f018ab319d9e" (UID: "67a49298-1d9a-4ed5-ae8c-f018ab319d9e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:33:31.922237 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:31.922138 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67a49298-1d9a-4ed5-ae8c-f018ab319d9e-service-ca" (OuterVolumeSpecName: "service-ca") pod "67a49298-1d9a-4ed5-ae8c-f018ab319d9e" (UID: "67a49298-1d9a-4ed5-ae8c-f018ab319d9e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:33:31.922237 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:31.922147 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67a49298-1d9a-4ed5-ae8c-f018ab319d9e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "67a49298-1d9a-4ed5-ae8c-f018ab319d9e" (UID: "67a49298-1d9a-4ed5-ae8c-f018ab319d9e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:33:31.923935 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:31.923906 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67a49298-1d9a-4ed5-ae8c-f018ab319d9e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "67a49298-1d9a-4ed5-ae8c-f018ab319d9e" (UID: "67a49298-1d9a-4ed5-ae8c-f018ab319d9e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:33:31.924037 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:31.923951 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67a49298-1d9a-4ed5-ae8c-f018ab319d9e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "67a49298-1d9a-4ed5-ae8c-f018ab319d9e" (UID: "67a49298-1d9a-4ed5-ae8c-f018ab319d9e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:33:31.924037 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:31.923998 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67a49298-1d9a-4ed5-ae8c-f018ab319d9e-kube-api-access-qg86n" (OuterVolumeSpecName: "kube-api-access-qg86n") pod "67a49298-1d9a-4ed5-ae8c-f018ab319d9e" (UID: "67a49298-1d9a-4ed5-ae8c-f018ab319d9e"). InnerVolumeSpecName "kube-api-access-qg86n".
PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:33:32.022801 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:32.022766 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/67a49298-1d9a-4ed5-ae8c-f018ab319d9e-console-serving-cert\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\""
Apr 24 22:33:32.022801 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:32.022794 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/67a49298-1d9a-4ed5-ae8c-f018ab319d9e-console-oauth-config\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\""
Apr 24 22:33:32.022801 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:32.022804 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qg86n\" (UniqueName: \"kubernetes.io/projected/67a49298-1d9a-4ed5-ae8c-f018ab319d9e-kube-api-access-qg86n\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\""
Apr 24 22:33:32.023062 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:32.022814 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/67a49298-1d9a-4ed5-ae8c-f018ab319d9e-oauth-serving-cert\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\""
Apr 24 22:33:32.023062 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:32.022823 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/67a49298-1d9a-4ed5-ae8c-f018ab319d9e-service-ca\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\""
Apr 24 22:33:32.023062 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:32.022858 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/67a49298-1d9a-4ed5-ae8c-f018ab319d9e-console-config\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\""
Apr 24 22:33:32.550830 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:32.550800 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-797d95865c-xk2x7_67a49298-1d9a-4ed5-ae8c-f018ab319d9e/console/0.log"
Apr 24 22:33:32.551061 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:32.550852 2574 generic.go:358] "Generic (PLEG): container finished" podID="67a49298-1d9a-4ed5-ae8c-f018ab319d9e" containerID="69f54502508ad25b83e6fec79c8b1e3d75ff561d02acedd8dcf8f423f10e6b8c" exitCode=2
Apr 24 22:33:32.551061 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:32.550885 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-797d95865c-xk2x7" event={"ID":"67a49298-1d9a-4ed5-ae8c-f018ab319d9e","Type":"ContainerDied","Data":"69f54502508ad25b83e6fec79c8b1e3d75ff561d02acedd8dcf8f423f10e6b8c"}
Apr 24 22:33:32.551061 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:32.550921 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-797d95865c-xk2x7"
Apr 24 22:33:32.551061 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:32.550938 2574 scope.go:117] "RemoveContainer" containerID="69f54502508ad25b83e6fec79c8b1e3d75ff561d02acedd8dcf8f423f10e6b8c"
Apr 24 22:33:32.551061 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:32.550927 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-797d95865c-xk2x7" event={"ID":"67a49298-1d9a-4ed5-ae8c-f018ab319d9e","Type":"ContainerDied","Data":"28d60490dab01a92f77a97b3d83ca28043de8b93a39e0ec30f7e842c2a81b7f6"}
Apr 24 22:33:32.559495 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:32.559476 2574 scope.go:117] "RemoveContainer" containerID="69f54502508ad25b83e6fec79c8b1e3d75ff561d02acedd8dcf8f423f10e6b8c"
Apr 24 22:33:32.559739 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:33:32.559721 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69f54502508ad25b83e6fec79c8b1e3d75ff561d02acedd8dcf8f423f10e6b8c\": container with ID starting with 69f54502508ad25b83e6fec79c8b1e3d75ff561d02acedd8dcf8f423f10e6b8c not found: ID does not exist" containerID="69f54502508ad25b83e6fec79c8b1e3d75ff561d02acedd8dcf8f423f10e6b8c"
Apr 24 22:33:32.559842 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:32.559752 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69f54502508ad25b83e6fec79c8b1e3d75ff561d02acedd8dcf8f423f10e6b8c"} err="failed to get container status \"69f54502508ad25b83e6fec79c8b1e3d75ff561d02acedd8dcf8f423f10e6b8c\": rpc error: code = NotFound desc = could not find container \"69f54502508ad25b83e6fec79c8b1e3d75ff561d02acedd8dcf8f423f10e6b8c\": container with ID starting with 69f54502508ad25b83e6fec79c8b1e3d75ff561d02acedd8dcf8f423f10e6b8c not found: ID does not exist"
Apr 24 22:33:32.571317 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:32.571291 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-797d95865c-xk2x7"]
Apr 24 22:33:32.575026 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:32.575004 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-797d95865c-xk2x7"]
Apr 24 22:33:32.847005 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:32.846923 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67a49298-1d9a-4ed5-ae8c-f018ab319d9e" path="/var/lib/kubelet/pods/67a49298-1d9a-4ed5-ae8c-f018ab319d9e/volumes"
Apr 24 22:33:39.486761 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:39.486723 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-58bf976dd7-qz957" podUID="7f31d010-df2a-4a29-bfd1-df8ccfd31ffa" containerName="console" containerID="cri-o://eea31cc608447fcf06697fb817520027a1118b7bf80f07c28692c157905b95a3" gracePeriod=15
Apr 24 22:33:39.749162 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:39.749139 2574
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-58bf976dd7-qz957_7f31d010-df2a-4a29-bfd1-df8ccfd31ffa/console/0.log"
Apr 24 22:33:39.749266 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:39.749197 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58bf976dd7-qz957"
Apr 24 22:33:39.886935 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:39.886903 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rghrc\" (UniqueName: \"kubernetes.io/projected/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-kube-api-access-rghrc\") pod \"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa\" (UID: \"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa\") "
Apr 24 22:33:39.886935 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:39.886937 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-console-oauth-config\") pod \"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa\" (UID: \"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa\") "
Apr 24 22:33:39.887169 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:39.886980 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-console-serving-cert\") pod \"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa\" (UID: \"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa\") "
Apr 24 22:33:39.887169 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:39.887097 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-console-config\") pod \"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa\" (UID: \"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa\") "
Apr 24 22:33:39.887169 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:39.887135 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-service-ca\") pod \"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa\" (UID: \"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa\") "
Apr 24 22:33:39.887326 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:39.887177 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-trusted-ca-bundle\") pod \"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa\" (UID: \"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa\") "
Apr 24 22:33:39.887326 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:39.887214 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-oauth-serving-cert\") pod \"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa\" (UID: \"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa\") "
Apr 24 22:33:39.887523 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:39.887492 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-console-config" (OuterVolumeSpecName: "console-config") pod "7f31d010-df2a-4a29-bfd1-df8ccfd31ffa" (UID: "7f31d010-df2a-4a29-bfd1-df8ccfd31ffa"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:33:39.887608 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:39.887525 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-service-ca" (OuterVolumeSpecName: "service-ca") pod "7f31d010-df2a-4a29-bfd1-df8ccfd31ffa" (UID: "7f31d010-df2a-4a29-bfd1-df8ccfd31ffa"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:33:39.887663 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:39.887603 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "7f31d010-df2a-4a29-bfd1-df8ccfd31ffa" (UID: "7f31d010-df2a-4a29-bfd1-df8ccfd31ffa"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:33:39.887743 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:39.887696 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7f31d010-df2a-4a29-bfd1-df8ccfd31ffa" (UID: "7f31d010-df2a-4a29-bfd1-df8ccfd31ffa"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:33:39.889112 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:39.889093 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7f31d010-df2a-4a29-bfd1-df8ccfd31ffa" (UID: "7f31d010-df2a-4a29-bfd1-df8ccfd31ffa"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:33:39.889501 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:39.889486 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7f31d010-df2a-4a29-bfd1-df8ccfd31ffa" (UID: "7f31d010-df2a-4a29-bfd1-df8ccfd31ffa"). InnerVolumeSpecName "console-serving-cert".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:33:39.889561 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:39.889517 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-kube-api-access-rghrc" (OuterVolumeSpecName: "kube-api-access-rghrc") pod "7f31d010-df2a-4a29-bfd1-df8ccfd31ffa" (UID: "7f31d010-df2a-4a29-bfd1-df8ccfd31ffa"). InnerVolumeSpecName "kube-api-access-rghrc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:33:39.988383 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:39.988294 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-console-config\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 22:33:39.988383 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:39.988324 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-service-ca\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 22:33:39.988383 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:39.988341 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-trusted-ca-bundle\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 22:33:39.988383 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:39.988355 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-oauth-serving-cert\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 22:33:39.988383 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:39.988369 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rghrc\" (UniqueName: 
\"kubernetes.io/projected/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-kube-api-access-rghrc\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 22:33:39.988383 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:39.988382 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-console-oauth-config\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 22:33:39.988702 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:39.988395 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa-console-serving-cert\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 22:33:40.586628 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:40.586600 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-58bf976dd7-qz957_7f31d010-df2a-4a29-bfd1-df8ccfd31ffa/console/0.log" Apr 24 22:33:40.587046 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:40.586639 2574 generic.go:358] "Generic (PLEG): container finished" podID="7f31d010-df2a-4a29-bfd1-df8ccfd31ffa" containerID="eea31cc608447fcf06697fb817520027a1118b7bf80f07c28692c157905b95a3" exitCode=2 Apr 24 22:33:40.587046 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:40.586666 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58bf976dd7-qz957" event={"ID":"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa","Type":"ContainerDied","Data":"eea31cc608447fcf06697fb817520027a1118b7bf80f07c28692c157905b95a3"} Apr 24 22:33:40.587046 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:40.586687 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58bf976dd7-qz957" event={"ID":"7f31d010-df2a-4a29-bfd1-df8ccfd31ffa","Type":"ContainerDied","Data":"e73cbaeb56cce53eb75409327e5a93192c9e6965f4cf0a7200a420c327e840e5"} Apr 24 22:33:40.587046 
ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:40.586701 2574 scope.go:117] "RemoveContainer" containerID="eea31cc608447fcf06697fb817520027a1118b7bf80f07c28692c157905b95a3" Apr 24 22:33:40.587046 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:40.586724 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58bf976dd7-qz957" Apr 24 22:33:40.595281 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:40.595260 2574 scope.go:117] "RemoveContainer" containerID="eea31cc608447fcf06697fb817520027a1118b7bf80f07c28692c157905b95a3" Apr 24 22:33:40.595563 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:33:40.595541 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eea31cc608447fcf06697fb817520027a1118b7bf80f07c28692c157905b95a3\": container with ID starting with eea31cc608447fcf06697fb817520027a1118b7bf80f07c28692c157905b95a3 not found: ID does not exist" containerID="eea31cc608447fcf06697fb817520027a1118b7bf80f07c28692c157905b95a3" Apr 24 22:33:40.595612 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:40.595570 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eea31cc608447fcf06697fb817520027a1118b7bf80f07c28692c157905b95a3"} err="failed to get container status \"eea31cc608447fcf06697fb817520027a1118b7bf80f07c28692c157905b95a3\": rpc error: code = NotFound desc = could not find container \"eea31cc608447fcf06697fb817520027a1118b7bf80f07c28692c157905b95a3\": container with ID starting with eea31cc608447fcf06697fb817520027a1118b7bf80f07c28692c157905b95a3 not found: ID does not exist" Apr 24 22:33:40.608050 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:40.608030 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-58bf976dd7-qz957"] Apr 24 22:33:40.611465 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:40.611445 2574 kubelet.go:2547] "SyncLoop REMOVE" 
source="api" pods=["openshift-console/console-58bf976dd7-qz957"] Apr 24 22:33:40.846265 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:40.846180 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f31d010-df2a-4a29-bfd1-df8ccfd31ffa" path="/var/lib/kubelet/pods/7f31d010-df2a-4a29-bfd1-df8ccfd31ffa/volumes" Apr 24 22:33:47.316719 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:47.316693 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-8d5c47b75-qtv9m" Apr 24 22:33:47.320649 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:47.320625 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-8d5c47b75-qtv9m" Apr 24 22:33:49.589114 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:49.589070 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7c9d9ddd69-hxpjz" podUID="815bae00-356b-484d-9532-37027c9aae29" containerName="console" containerID="cri-o://9f4c34aa62047bf88c2669eaa252359889ba20fd39733a06727947251c91e515" gracePeriod=15 Apr 24 22:33:49.872749 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:49.872719 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7c9d9ddd69-hxpjz_815bae00-356b-484d-9532-37027c9aae29/console/0.log" Apr 24 22:33:49.872868 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:49.872779 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7c9d9ddd69-hxpjz" Apr 24 22:33:49.981290 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:49.981258 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/815bae00-356b-484d-9532-37027c9aae29-oauth-serving-cert\") pod \"815bae00-356b-484d-9532-37027c9aae29\" (UID: \"815bae00-356b-484d-9532-37027c9aae29\") " Apr 24 22:33:49.981290 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:49.981294 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wg9d\" (UniqueName: \"kubernetes.io/projected/815bae00-356b-484d-9532-37027c9aae29-kube-api-access-2wg9d\") pod \"815bae00-356b-484d-9532-37027c9aae29\" (UID: \"815bae00-356b-484d-9532-37027c9aae29\") " Apr 24 22:33:49.981534 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:49.981320 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/815bae00-356b-484d-9532-37027c9aae29-console-oauth-config\") pod \"815bae00-356b-484d-9532-37027c9aae29\" (UID: \"815bae00-356b-484d-9532-37027c9aae29\") " Apr 24 22:33:49.981534 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:49.981339 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/815bae00-356b-484d-9532-37027c9aae29-console-config\") pod \"815bae00-356b-484d-9532-37027c9aae29\" (UID: \"815bae00-356b-484d-9532-37027c9aae29\") " Apr 24 22:33:49.981534 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:49.981356 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/815bae00-356b-484d-9532-37027c9aae29-console-serving-cert\") pod \"815bae00-356b-484d-9532-37027c9aae29\" (UID: \"815bae00-356b-484d-9532-37027c9aae29\") " Apr 24 
22:33:49.981534 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:49.981386 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/815bae00-356b-484d-9532-37027c9aae29-trusted-ca-bundle\") pod \"815bae00-356b-484d-9532-37027c9aae29\" (UID: \"815bae00-356b-484d-9532-37027c9aae29\") " Apr 24 22:33:49.981534 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:49.981469 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/815bae00-356b-484d-9532-37027c9aae29-service-ca\") pod \"815bae00-356b-484d-9532-37027c9aae29\" (UID: \"815bae00-356b-484d-9532-37027c9aae29\") " Apr 24 22:33:49.981775 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:49.981717 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/815bae00-356b-484d-9532-37027c9aae29-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "815bae00-356b-484d-9532-37027c9aae29" (UID: "815bae00-356b-484d-9532-37027c9aae29"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:33:49.982075 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:49.981850 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/815bae00-356b-484d-9532-37027c9aae29-console-config" (OuterVolumeSpecName: "console-config") pod "815bae00-356b-484d-9532-37027c9aae29" (UID: "815bae00-356b-484d-9532-37027c9aae29"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:33:49.982075 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:49.981939 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/815bae00-356b-484d-9532-37027c9aae29-oauth-serving-cert\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 22:33:49.982075 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:49.981981 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/815bae00-356b-484d-9532-37027c9aae29-console-config\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 22:33:49.982075 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:49.981990 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/815bae00-356b-484d-9532-37027c9aae29-service-ca" (OuterVolumeSpecName: "service-ca") pod "815bae00-356b-484d-9532-37027c9aae29" (UID: "815bae00-356b-484d-9532-37027c9aae29"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:33:49.982362 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:49.982107 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/815bae00-356b-484d-9532-37027c9aae29-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "815bae00-356b-484d-9532-37027c9aae29" (UID: "815bae00-356b-484d-9532-37027c9aae29"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:33:49.983668 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:49.983643 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/815bae00-356b-484d-9532-37027c9aae29-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "815bae00-356b-484d-9532-37027c9aae29" (UID: "815bae00-356b-484d-9532-37027c9aae29"). 
InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:33:49.983668 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:49.983661 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/815bae00-356b-484d-9532-37027c9aae29-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "815bae00-356b-484d-9532-37027c9aae29" (UID: "815bae00-356b-484d-9532-37027c9aae29"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:33:49.983804 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:49.983703 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/815bae00-356b-484d-9532-37027c9aae29-kube-api-access-2wg9d" (OuterVolumeSpecName: "kube-api-access-2wg9d") pod "815bae00-356b-484d-9532-37027c9aae29" (UID: "815bae00-356b-484d-9532-37027c9aae29"). InnerVolumeSpecName "kube-api-access-2wg9d". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:33:50.083424 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:50.083383 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/815bae00-356b-484d-9532-37027c9aae29-service-ca\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 22:33:50.083424 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:50.083417 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2wg9d\" (UniqueName: \"kubernetes.io/projected/815bae00-356b-484d-9532-37027c9aae29-kube-api-access-2wg9d\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 22:33:50.083424 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:50.083433 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/815bae00-356b-484d-9532-37027c9aae29-console-oauth-config\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath 
\"\"" Apr 24 22:33:50.083656 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:50.083447 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/815bae00-356b-484d-9532-37027c9aae29-console-serving-cert\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 22:33:50.083656 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:50.083460 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/815bae00-356b-484d-9532-37027c9aae29-trusted-ca-bundle\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 22:33:50.620276 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:50.620247 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7c9d9ddd69-hxpjz_815bae00-356b-484d-9532-37027c9aae29/console/0.log" Apr 24 22:33:50.620689 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:50.620285 2574 generic.go:358] "Generic (PLEG): container finished" podID="815bae00-356b-484d-9532-37027c9aae29" containerID="9f4c34aa62047bf88c2669eaa252359889ba20fd39733a06727947251c91e515" exitCode=2 Apr 24 22:33:50.620689 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:50.620328 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c9d9ddd69-hxpjz" event={"ID":"815bae00-356b-484d-9532-37027c9aae29","Type":"ContainerDied","Data":"9f4c34aa62047bf88c2669eaa252359889ba20fd39733a06727947251c91e515"} Apr 24 22:33:50.620689 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:50.620355 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c9d9ddd69-hxpjz" event={"ID":"815bae00-356b-484d-9532-37027c9aae29","Type":"ContainerDied","Data":"a03e971326b5c751838f4408c11b41160b704ba739bb7811ba39a1ae0acab14e"} Apr 24 22:33:50.620689 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:50.620360 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7c9d9ddd69-hxpjz" Apr 24 22:33:50.620689 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:50.620374 2574 scope.go:117] "RemoveContainer" containerID="9f4c34aa62047bf88c2669eaa252359889ba20fd39733a06727947251c91e515" Apr 24 22:33:50.628796 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:50.628780 2574 scope.go:117] "RemoveContainer" containerID="9f4c34aa62047bf88c2669eaa252359889ba20fd39733a06727947251c91e515" Apr 24 22:33:50.629066 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:33:50.629044 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f4c34aa62047bf88c2669eaa252359889ba20fd39733a06727947251c91e515\": container with ID starting with 9f4c34aa62047bf88c2669eaa252359889ba20fd39733a06727947251c91e515 not found: ID does not exist" containerID="9f4c34aa62047bf88c2669eaa252359889ba20fd39733a06727947251c91e515" Apr 24 22:33:50.629147 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:50.629073 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f4c34aa62047bf88c2669eaa252359889ba20fd39733a06727947251c91e515"} err="failed to get container status \"9f4c34aa62047bf88c2669eaa252359889ba20fd39733a06727947251c91e515\": rpc error: code = NotFound desc = could not find container \"9f4c34aa62047bf88c2669eaa252359889ba20fd39733a06727947251c91e515\": container with ID starting with 9f4c34aa62047bf88c2669eaa252359889ba20fd39733a06727947251c91e515 not found: ID does not exist" Apr 24 22:33:50.640405 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:50.640379 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7c9d9ddd69-hxpjz"] Apr 24 22:33:50.645818 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:33:50.645796 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7c9d9ddd69-hxpjz"] Apr 24 22:33:50.846765 ip-10-0-133-161 kubenswrapper[2574]: 
I0424 22:33:50.846732 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="815bae00-356b-484d-9532-37027c9aae29" path="/var/lib/kubelet/pods/815bae00-356b-484d-9532-37027c9aae29/volumes" Apr 24 22:34:02.688839 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:02.688801 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0c4cf71-fe26-4a65-bc22-b98bb5827d73-metrics-certs\") pod \"network-metrics-daemon-8ztg8\" (UID: \"d0c4cf71-fe26-4a65-bc22-b98bb5827d73\") " pod="openshift-multus/network-metrics-daemon-8ztg8" Apr 24 22:34:02.691140 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:02.691117 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0c4cf71-fe26-4a65-bc22-b98bb5827d73-metrics-certs\") pod \"network-metrics-daemon-8ztg8\" (UID: \"d0c4cf71-fe26-4a65-bc22-b98bb5827d73\") " pod="openshift-multus/network-metrics-daemon-8ztg8" Apr 24 22:34:02.946635 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:02.946556 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-h92dk\"" Apr 24 22:34:02.954575 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:02.954554 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ztg8" Apr 24 22:34:03.074201 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:03.074169 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8ztg8"] Apr 24 22:34:03.076929 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:34:03.076902 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0c4cf71_fe26_4a65_bc22_b98bb5827d73.slice/crio-c2e396a0c4c1eaa96ea5a095df0a38b01b6ae5055af9d67946984cfd92a6a01d WatchSource:0}: Error finding container c2e396a0c4c1eaa96ea5a095df0a38b01b6ae5055af9d67946984cfd92a6a01d: Status 404 returned error can't find the container with id c2e396a0c4c1eaa96ea5a095df0a38b01b6ae5055af9d67946984cfd92a6a01d Apr 24 22:34:03.659095 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:03.659056 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8ztg8" event={"ID":"d0c4cf71-fe26-4a65-bc22-b98bb5827d73","Type":"ContainerStarted","Data":"c2e396a0c4c1eaa96ea5a095df0a38b01b6ae5055af9d67946984cfd92a6a01d"} Apr 24 22:34:04.664476 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:04.664438 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8ztg8" event={"ID":"d0c4cf71-fe26-4a65-bc22-b98bb5827d73","Type":"ContainerStarted","Data":"784523461c6d9b4215621e84cc8e5896f81113927ef2006333f34a00d57fb4d7"} Apr 24 22:34:04.664832 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:04.664482 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8ztg8" event={"ID":"d0c4cf71-fe26-4a65-bc22-b98bb5827d73","Type":"ContainerStarted","Data":"677ce7d36ffecbd16ef7a909867fce3138b0fbf4c3956ce466dc26be9ec06def"} Apr 24 22:34:04.682080 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:04.682033 2574 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-8ztg8" podStartSLOduration=252.705228195 podStartE2EDuration="4m13.682019699s" podCreationTimestamp="2026-04-24 22:29:51 +0000 UTC" firstStartedPulling="2026-04-24 22:34:03.079120797 +0000 UTC m=+252.802713345" lastFinishedPulling="2026-04-24 22:34:04.0559123 +0000 UTC m=+253.779504849" observedRunningTime="2026-04-24 22:34:04.681060409 +0000 UTC m=+254.404653005" watchObservedRunningTime="2026-04-24 22:34:04.682019699 +0000 UTC m=+254.405612266" Apr 24 22:34:09.212892 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:09.212842 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:09.231612 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:09.231585 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:09.693923 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:09.693897 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:27.349667 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:27.349630 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-79674f6858-rbdnn"] Apr 24 22:34:27.350110 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:27.350094 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="67a49298-1d9a-4ed5-ae8c-f018ab319d9e" containerName="console" Apr 24 22:34:27.350152 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:27.350110 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="67a49298-1d9a-4ed5-ae8c-f018ab319d9e" containerName="console" Apr 24 22:34:27.350152 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:27.350127 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="815bae00-356b-484d-9532-37027c9aae29" containerName="console" Apr 24 22:34:27.350152 ip-10-0-133-161 
kubenswrapper[2574]: I0424 22:34:27.350135 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="815bae00-356b-484d-9532-37027c9aae29" containerName="console" Apr 24 22:34:27.350241 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:27.350159 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7f31d010-df2a-4a29-bfd1-df8ccfd31ffa" containerName="console" Apr 24 22:34:27.350241 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:27.350168 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f31d010-df2a-4a29-bfd1-df8ccfd31ffa" containerName="console" Apr 24 22:34:27.350241 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:27.350240 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="67a49298-1d9a-4ed5-ae8c-f018ab319d9e" containerName="console" Apr 24 22:34:27.350325 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:27.350251 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="7f31d010-df2a-4a29-bfd1-df8ccfd31ffa" containerName="console" Apr 24 22:34:27.350325 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:27.350262 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="815bae00-356b-484d-9532-37027c9aae29" containerName="console" Apr 24 22:34:27.355635 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:27.355616 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-79674f6858-rbdnn" Apr 24 22:34:27.366953 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:27.366927 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79674f6858-rbdnn"] Apr 24 22:34:27.497690 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:27.497662 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/03578f00-0c49-4106-b29d-dd1d222a1078-service-ca\") pod \"console-79674f6858-rbdnn\" (UID: \"03578f00-0c49-4106-b29d-dd1d222a1078\") " pod="openshift-console/console-79674f6858-rbdnn" Apr 24 22:34:27.497861 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:27.497716 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03578f00-0c49-4106-b29d-dd1d222a1078-trusted-ca-bundle\") pod \"console-79674f6858-rbdnn\" (UID: \"03578f00-0c49-4106-b29d-dd1d222a1078\") " pod="openshift-console/console-79674f6858-rbdnn" Apr 24 22:34:27.497861 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:27.497763 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/03578f00-0c49-4106-b29d-dd1d222a1078-oauth-serving-cert\") pod \"console-79674f6858-rbdnn\" (UID: \"03578f00-0c49-4106-b29d-dd1d222a1078\") " pod="openshift-console/console-79674f6858-rbdnn" Apr 24 22:34:27.497861 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:27.497800 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/03578f00-0c49-4106-b29d-dd1d222a1078-console-config\") pod \"console-79674f6858-rbdnn\" (UID: \"03578f00-0c49-4106-b29d-dd1d222a1078\") " pod="openshift-console/console-79674f6858-rbdnn" Apr 24 22:34:27.498105 
ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:27.497877 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/03578f00-0c49-4106-b29d-dd1d222a1078-console-oauth-config\") pod \"console-79674f6858-rbdnn\" (UID: \"03578f00-0c49-4106-b29d-dd1d222a1078\") " pod="openshift-console/console-79674f6858-rbdnn" Apr 24 22:34:27.498105 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:27.497924 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4bqh\" (UniqueName: \"kubernetes.io/projected/03578f00-0c49-4106-b29d-dd1d222a1078-kube-api-access-g4bqh\") pod \"console-79674f6858-rbdnn\" (UID: \"03578f00-0c49-4106-b29d-dd1d222a1078\") " pod="openshift-console/console-79674f6858-rbdnn" Apr 24 22:34:27.498105 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:27.497990 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/03578f00-0c49-4106-b29d-dd1d222a1078-console-serving-cert\") pod \"console-79674f6858-rbdnn\" (UID: \"03578f00-0c49-4106-b29d-dd1d222a1078\") " pod="openshift-console/console-79674f6858-rbdnn" Apr 24 22:34:27.598658 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:27.598618 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03578f00-0c49-4106-b29d-dd1d222a1078-trusted-ca-bundle\") pod \"console-79674f6858-rbdnn\" (UID: \"03578f00-0c49-4106-b29d-dd1d222a1078\") " pod="openshift-console/console-79674f6858-rbdnn" Apr 24 22:34:27.598658 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:27.598657 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/03578f00-0c49-4106-b29d-dd1d222a1078-oauth-serving-cert\") pod 
\"console-79674f6858-rbdnn\" (UID: \"03578f00-0c49-4106-b29d-dd1d222a1078\") " pod="openshift-console/console-79674f6858-rbdnn" Apr 24 22:34:27.598929 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:27.598677 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/03578f00-0c49-4106-b29d-dd1d222a1078-console-config\") pod \"console-79674f6858-rbdnn\" (UID: \"03578f00-0c49-4106-b29d-dd1d222a1078\") " pod="openshift-console/console-79674f6858-rbdnn" Apr 24 22:34:27.598929 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:27.598801 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/03578f00-0c49-4106-b29d-dd1d222a1078-console-oauth-config\") pod \"console-79674f6858-rbdnn\" (UID: \"03578f00-0c49-4106-b29d-dd1d222a1078\") " pod="openshift-console/console-79674f6858-rbdnn" Apr 24 22:34:27.598929 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:27.598845 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g4bqh\" (UniqueName: \"kubernetes.io/projected/03578f00-0c49-4106-b29d-dd1d222a1078-kube-api-access-g4bqh\") pod \"console-79674f6858-rbdnn\" (UID: \"03578f00-0c49-4106-b29d-dd1d222a1078\") " pod="openshift-console/console-79674f6858-rbdnn" Apr 24 22:34:27.598929 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:27.598879 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/03578f00-0c49-4106-b29d-dd1d222a1078-console-serving-cert\") pod \"console-79674f6858-rbdnn\" (UID: \"03578f00-0c49-4106-b29d-dd1d222a1078\") " pod="openshift-console/console-79674f6858-rbdnn" Apr 24 22:34:27.599170 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:27.598939 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/03578f00-0c49-4106-b29d-dd1d222a1078-service-ca\") pod \"console-79674f6858-rbdnn\" (UID: \"03578f00-0c49-4106-b29d-dd1d222a1078\") " pod="openshift-console/console-79674f6858-rbdnn" Apr 24 22:34:27.599463 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:27.599433 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/03578f00-0c49-4106-b29d-dd1d222a1078-oauth-serving-cert\") pod \"console-79674f6858-rbdnn\" (UID: \"03578f00-0c49-4106-b29d-dd1d222a1078\") " pod="openshift-console/console-79674f6858-rbdnn" Apr 24 22:34:27.599572 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:27.599490 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/03578f00-0c49-4106-b29d-dd1d222a1078-console-config\") pod \"console-79674f6858-rbdnn\" (UID: \"03578f00-0c49-4106-b29d-dd1d222a1078\") " pod="openshift-console/console-79674f6858-rbdnn" Apr 24 22:34:27.599619 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:27.599605 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03578f00-0c49-4106-b29d-dd1d222a1078-trusted-ca-bundle\") pod \"console-79674f6858-rbdnn\" (UID: \"03578f00-0c49-4106-b29d-dd1d222a1078\") " pod="openshift-console/console-79674f6858-rbdnn" Apr 24 22:34:27.599665 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:27.599620 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/03578f00-0c49-4106-b29d-dd1d222a1078-service-ca\") pod \"console-79674f6858-rbdnn\" (UID: \"03578f00-0c49-4106-b29d-dd1d222a1078\") " pod="openshift-console/console-79674f6858-rbdnn" Apr 24 22:34:27.601372 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:27.601320 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/03578f00-0c49-4106-b29d-dd1d222a1078-console-serving-cert\") pod \"console-79674f6858-rbdnn\" (UID: \"03578f00-0c49-4106-b29d-dd1d222a1078\") " pod="openshift-console/console-79674f6858-rbdnn" Apr 24 22:34:27.601372 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:27.601348 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/03578f00-0c49-4106-b29d-dd1d222a1078-console-oauth-config\") pod \"console-79674f6858-rbdnn\" (UID: \"03578f00-0c49-4106-b29d-dd1d222a1078\") " pod="openshift-console/console-79674f6858-rbdnn" Apr 24 22:34:27.607174 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:27.607155 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4bqh\" (UniqueName: \"kubernetes.io/projected/03578f00-0c49-4106-b29d-dd1d222a1078-kube-api-access-g4bqh\") pod \"console-79674f6858-rbdnn\" (UID: \"03578f00-0c49-4106-b29d-dd1d222a1078\") " pod="openshift-console/console-79674f6858-rbdnn" Apr 24 22:34:27.665223 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:27.665185 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-79674f6858-rbdnn" Apr 24 22:34:27.791566 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:27.791543 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79674f6858-rbdnn"] Apr 24 22:34:27.793569 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:34:27.793540 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03578f00_0c49_4106_b29d_dd1d222a1078.slice/crio-7cc7dcb95981f100c11a5e8cdd5e8500527ec60cf1910d69dee7ec29732cb620 WatchSource:0}: Error finding container 7cc7dcb95981f100c11a5e8cdd5e8500527ec60cf1910d69dee7ec29732cb620: Status 404 returned error can't find the container with id 7cc7dcb95981f100c11a5e8cdd5e8500527ec60cf1910d69dee7ec29732cb620 Apr 24 22:34:28.735897 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:28.735857 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79674f6858-rbdnn" event={"ID":"03578f00-0c49-4106-b29d-dd1d222a1078","Type":"ContainerStarted","Data":"4b9799ef542a5cab1b751d9f7cd16ea4522ea817c078218ae80bf967aaf7ec63"} Apr 24 22:34:28.735897 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:28.735901 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79674f6858-rbdnn" event={"ID":"03578f00-0c49-4106-b29d-dd1d222a1078","Type":"ContainerStarted","Data":"7cc7dcb95981f100c11a5e8cdd5e8500527ec60cf1910d69dee7ec29732cb620"} Apr 24 22:34:28.756555 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:28.756507 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-79674f6858-rbdnn" podStartSLOduration=1.756491928 podStartE2EDuration="1.756491928s" podCreationTimestamp="2026-04-24 22:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:34:28.754249495 +0000 UTC 
m=+278.477842064" watchObservedRunningTime="2026-04-24 22:34:28.756491928 +0000 UTC m=+278.480084499" Apr 24 22:34:37.666268 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:37.666214 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-79674f6858-rbdnn" Apr 24 22:34:37.666766 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:37.666511 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-79674f6858-rbdnn" Apr 24 22:34:37.671074 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:37.671053 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-79674f6858-rbdnn" Apr 24 22:34:37.769177 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:37.769149 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-79674f6858-rbdnn" Apr 24 22:34:37.816780 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:37.816746 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b57788ffb-vzfrb"] Apr 24 22:34:50.766450 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:50.766418 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6z9wv_2ae54e6c-a291-4b2e-8885-ba0d08f9048c/console-operator/1.log" Apr 24 22:34:50.767065 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:50.766924 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6z9wv_2ae54e6c-a291-4b2e-8885-ba0d08f9048c/console-operator/1.log" Apr 24 22:34:50.776086 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:34:50.776062 2574 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 22:35:02.840859 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:35:02.840821 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6b57788ffb-vzfrb" 
podUID="55686459-acd5-4429-9b1f-2db2bbc3e5a2" containerName="console" containerID="cri-o://4a02805e183a7f5a3809db4da03d3fa723e0b8c534c25bd0f9db364d71694a6a" gracePeriod=15 Apr 24 22:35:03.102553 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:35:03.102528 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b57788ffb-vzfrb_55686459-acd5-4429-9b1f-2db2bbc3e5a2/console/0.log" Apr 24 22:35:03.102672 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:35:03.102586 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b57788ffb-vzfrb" Apr 24 22:35:03.191947 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:35:03.191912 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55686459-acd5-4429-9b1f-2db2bbc3e5a2-service-ca\") pod \"55686459-acd5-4429-9b1f-2db2bbc3e5a2\" (UID: \"55686459-acd5-4429-9b1f-2db2bbc3e5a2\") " Apr 24 22:35:03.191947 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:35:03.191952 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/55686459-acd5-4429-9b1f-2db2bbc3e5a2-console-oauth-config\") pod \"55686459-acd5-4429-9b1f-2db2bbc3e5a2\" (UID: \"55686459-acd5-4429-9b1f-2db2bbc3e5a2\") " Apr 24 22:35:03.192178 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:35:03.192006 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/55686459-acd5-4429-9b1f-2db2bbc3e5a2-console-serving-cert\") pod \"55686459-acd5-4429-9b1f-2db2bbc3e5a2\" (UID: \"55686459-acd5-4429-9b1f-2db2bbc3e5a2\") " Apr 24 22:35:03.192178 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:35:03.192113 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/55686459-acd5-4429-9b1f-2db2bbc3e5a2-oauth-serving-cert\") pod \"55686459-acd5-4429-9b1f-2db2bbc3e5a2\" (UID: \"55686459-acd5-4429-9b1f-2db2bbc3e5a2\") " Apr 24 22:35:03.192178 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:35:03.192145 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55686459-acd5-4429-9b1f-2db2bbc3e5a2-trusted-ca-bundle\") pod \"55686459-acd5-4429-9b1f-2db2bbc3e5a2\" (UID: \"55686459-acd5-4429-9b1f-2db2bbc3e5a2\") " Apr 24 22:35:03.192328 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:35:03.192181 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2vjf\" (UniqueName: \"kubernetes.io/projected/55686459-acd5-4429-9b1f-2db2bbc3e5a2-kube-api-access-j2vjf\") pod \"55686459-acd5-4429-9b1f-2db2bbc3e5a2\" (UID: \"55686459-acd5-4429-9b1f-2db2bbc3e5a2\") " Apr 24 22:35:03.192328 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:35:03.192317 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/55686459-acd5-4429-9b1f-2db2bbc3e5a2-console-config\") pod \"55686459-acd5-4429-9b1f-2db2bbc3e5a2\" (UID: \"55686459-acd5-4429-9b1f-2db2bbc3e5a2\") " Apr 24 22:35:03.192443 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:35:03.192337 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55686459-acd5-4429-9b1f-2db2bbc3e5a2-service-ca" (OuterVolumeSpecName: "service-ca") pod "55686459-acd5-4429-9b1f-2db2bbc3e5a2" (UID: "55686459-acd5-4429-9b1f-2db2bbc3e5a2"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:35:03.192550 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:35:03.192524 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55686459-acd5-4429-9b1f-2db2bbc3e5a2-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "55686459-acd5-4429-9b1f-2db2bbc3e5a2" (UID: "55686459-acd5-4429-9b1f-2db2bbc3e5a2"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:35:03.192671 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:35:03.192553 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55686459-acd5-4429-9b1f-2db2bbc3e5a2-service-ca\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 22:35:03.192671 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:35:03.192589 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55686459-acd5-4429-9b1f-2db2bbc3e5a2-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "55686459-acd5-4429-9b1f-2db2bbc3e5a2" (UID: "55686459-acd5-4429-9b1f-2db2bbc3e5a2"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:35:03.192671 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:35:03.192633 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55686459-acd5-4429-9b1f-2db2bbc3e5a2-console-config" (OuterVolumeSpecName: "console-config") pod "55686459-acd5-4429-9b1f-2db2bbc3e5a2" (UID: "55686459-acd5-4429-9b1f-2db2bbc3e5a2"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:35:03.194180 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:35:03.194153 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55686459-acd5-4429-9b1f-2db2bbc3e5a2-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "55686459-acd5-4429-9b1f-2db2bbc3e5a2" (UID: "55686459-acd5-4429-9b1f-2db2bbc3e5a2"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:35:03.194337 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:35:03.194319 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55686459-acd5-4429-9b1f-2db2bbc3e5a2-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "55686459-acd5-4429-9b1f-2db2bbc3e5a2" (UID: "55686459-acd5-4429-9b1f-2db2bbc3e5a2"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:35:03.194418 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:35:03.194396 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55686459-acd5-4429-9b1f-2db2bbc3e5a2-kube-api-access-j2vjf" (OuterVolumeSpecName: "kube-api-access-j2vjf") pod "55686459-acd5-4429-9b1f-2db2bbc3e5a2" (UID: "55686459-acd5-4429-9b1f-2db2bbc3e5a2"). InnerVolumeSpecName "kube-api-access-j2vjf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:35:03.292933 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:35:03.292898 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/55686459-acd5-4429-9b1f-2db2bbc3e5a2-console-config\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 22:35:03.292933 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:35:03.292925 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/55686459-acd5-4429-9b1f-2db2bbc3e5a2-console-oauth-config\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 22:35:03.292933 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:35:03.292935 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/55686459-acd5-4429-9b1f-2db2bbc3e5a2-console-serving-cert\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 22:35:03.292933 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:35:03.292943 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/55686459-acd5-4429-9b1f-2db2bbc3e5a2-oauth-serving-cert\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 22:35:03.293209 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:35:03.292952 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55686459-acd5-4429-9b1f-2db2bbc3e5a2-trusted-ca-bundle\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 22:35:03.293209 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:35:03.292976 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j2vjf\" (UniqueName: \"kubernetes.io/projected/55686459-acd5-4429-9b1f-2db2bbc3e5a2-kube-api-access-j2vjf\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 22:35:03.840243 
ip-10-0-133-161 kubenswrapper[2574]: I0424 22:35:03.840217 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b57788ffb-vzfrb_55686459-acd5-4429-9b1f-2db2bbc3e5a2/console/0.log" Apr 24 22:35:03.840419 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:35:03.840253 2574 generic.go:358] "Generic (PLEG): container finished" podID="55686459-acd5-4429-9b1f-2db2bbc3e5a2" containerID="4a02805e183a7f5a3809db4da03d3fa723e0b8c534c25bd0f9db364d71694a6a" exitCode=2 Apr 24 22:35:03.840419 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:35:03.840300 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b57788ffb-vzfrb" event={"ID":"55686459-acd5-4429-9b1f-2db2bbc3e5a2","Type":"ContainerDied","Data":"4a02805e183a7f5a3809db4da03d3fa723e0b8c534c25bd0f9db364d71694a6a"} Apr 24 22:35:03.840419 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:35:03.840322 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b57788ffb-vzfrb" event={"ID":"55686459-acd5-4429-9b1f-2db2bbc3e5a2","Type":"ContainerDied","Data":"ca4615e5073be5602b9e6aa0b0f3da5667c7034778b6f691c99a637b75b548a2"} Apr 24 22:35:03.840419 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:35:03.840322 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6b57788ffb-vzfrb" Apr 24 22:35:03.840419 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:35:03.840336 2574 scope.go:117] "RemoveContainer" containerID="4a02805e183a7f5a3809db4da03d3fa723e0b8c534c25bd0f9db364d71694a6a" Apr 24 22:35:03.848576 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:35:03.848446 2574 scope.go:117] "RemoveContainer" containerID="4a02805e183a7f5a3809db4da03d3fa723e0b8c534c25bd0f9db364d71694a6a" Apr 24 22:35:03.848766 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:35:03.848675 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a02805e183a7f5a3809db4da03d3fa723e0b8c534c25bd0f9db364d71694a6a\": container with ID starting with 4a02805e183a7f5a3809db4da03d3fa723e0b8c534c25bd0f9db364d71694a6a not found: ID does not exist" containerID="4a02805e183a7f5a3809db4da03d3fa723e0b8c534c25bd0f9db364d71694a6a" Apr 24 22:35:03.848766 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:35:03.848698 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a02805e183a7f5a3809db4da03d3fa723e0b8c534c25bd0f9db364d71694a6a"} err="failed to get container status \"4a02805e183a7f5a3809db4da03d3fa723e0b8c534c25bd0f9db364d71694a6a\": rpc error: code = NotFound desc = could not find container \"4a02805e183a7f5a3809db4da03d3fa723e0b8c534c25bd0f9db364d71694a6a\": container with ID starting with 4a02805e183a7f5a3809db4da03d3fa723e0b8c534c25bd0f9db364d71694a6a not found: ID does not exist" Apr 24 22:35:03.864626 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:35:03.862506 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b57788ffb-vzfrb"] Apr 24 22:35:03.866588 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:35:03.866566 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6b57788ffb-vzfrb"] Apr 24 22:35:04.845701 ip-10-0-133-161 kubenswrapper[2574]: 
I0424 22:35:04.845664 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55686459-acd5-4429-9b1f-2db2bbc3e5a2" path="/var/lib/kubelet/pods/55686459-acd5-4429-9b1f-2db2bbc3e5a2/volumes" Apr 24 22:38:17.799575 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:17.799537 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-84867b96fc-rbb57"] Apr 24 22:38:17.800118 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:17.799840 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="55686459-acd5-4429-9b1f-2db2bbc3e5a2" containerName="console" Apr 24 22:38:17.800118 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:17.799851 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="55686459-acd5-4429-9b1f-2db2bbc3e5a2" containerName="console" Apr 24 22:38:17.800118 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:17.799911 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="55686459-acd5-4429-9b1f-2db2bbc3e5a2" containerName="console" Apr 24 22:38:17.803242 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:17.803216 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-84867b96fc-rbb57" Apr 24 22:38:17.812028 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:17.812001 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84867b96fc-rbb57"] Apr 24 22:38:17.874097 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:17.874065 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njsgz\" (UniqueName: \"kubernetes.io/projected/5a693059-25b8-4d4d-89e9-694a049e62a0-kube-api-access-njsgz\") pod \"console-84867b96fc-rbb57\" (UID: \"5a693059-25b8-4d4d-89e9-694a049e62a0\") " pod="openshift-console/console-84867b96fc-rbb57" Apr 24 22:38:17.874097 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:17.874103 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5a693059-25b8-4d4d-89e9-694a049e62a0-console-config\") pod \"console-84867b96fc-rbb57\" (UID: \"5a693059-25b8-4d4d-89e9-694a049e62a0\") " pod="openshift-console/console-84867b96fc-rbb57" Apr 24 22:38:17.874316 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:17.874141 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5a693059-25b8-4d4d-89e9-694a049e62a0-service-ca\") pod \"console-84867b96fc-rbb57\" (UID: \"5a693059-25b8-4d4d-89e9-694a049e62a0\") " pod="openshift-console/console-84867b96fc-rbb57" Apr 24 22:38:17.874316 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:17.874223 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5a693059-25b8-4d4d-89e9-694a049e62a0-console-oauth-config\") pod \"console-84867b96fc-rbb57\" (UID: \"5a693059-25b8-4d4d-89e9-694a049e62a0\") " pod="openshift-console/console-84867b96fc-rbb57" Apr 24 
22:38:17.874316 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:17.874243 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a693059-25b8-4d4d-89e9-694a049e62a0-trusted-ca-bundle\") pod \"console-84867b96fc-rbb57\" (UID: \"5a693059-25b8-4d4d-89e9-694a049e62a0\") " pod="openshift-console/console-84867b96fc-rbb57" Apr 24 22:38:17.874316 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:17.874257 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5a693059-25b8-4d4d-89e9-694a049e62a0-oauth-serving-cert\") pod \"console-84867b96fc-rbb57\" (UID: \"5a693059-25b8-4d4d-89e9-694a049e62a0\") " pod="openshift-console/console-84867b96fc-rbb57" Apr 24 22:38:17.874316 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:17.874292 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a693059-25b8-4d4d-89e9-694a049e62a0-console-serving-cert\") pod \"console-84867b96fc-rbb57\" (UID: \"5a693059-25b8-4d4d-89e9-694a049e62a0\") " pod="openshift-console/console-84867b96fc-rbb57" Apr 24 22:38:17.975027 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:17.974994 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a693059-25b8-4d4d-89e9-694a049e62a0-console-serving-cert\") pod \"console-84867b96fc-rbb57\" (UID: \"5a693059-25b8-4d4d-89e9-694a049e62a0\") " pod="openshift-console/console-84867b96fc-rbb57" Apr 24 22:38:17.975027 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:17.975034 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-njsgz\" (UniqueName: 
\"kubernetes.io/projected/5a693059-25b8-4d4d-89e9-694a049e62a0-kube-api-access-njsgz\") pod \"console-84867b96fc-rbb57\" (UID: \"5a693059-25b8-4d4d-89e9-694a049e62a0\") " pod="openshift-console/console-84867b96fc-rbb57" Apr 24 22:38:17.975261 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:17.975055 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5a693059-25b8-4d4d-89e9-694a049e62a0-console-config\") pod \"console-84867b96fc-rbb57\" (UID: \"5a693059-25b8-4d4d-89e9-694a049e62a0\") " pod="openshift-console/console-84867b96fc-rbb57" Apr 24 22:38:17.975261 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:17.975086 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5a693059-25b8-4d4d-89e9-694a049e62a0-service-ca\") pod \"console-84867b96fc-rbb57\" (UID: \"5a693059-25b8-4d4d-89e9-694a049e62a0\") " pod="openshift-console/console-84867b96fc-rbb57" Apr 24 22:38:17.975261 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:17.975131 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5a693059-25b8-4d4d-89e9-694a049e62a0-console-oauth-config\") pod \"console-84867b96fc-rbb57\" (UID: \"5a693059-25b8-4d4d-89e9-694a049e62a0\") " pod="openshift-console/console-84867b96fc-rbb57" Apr 24 22:38:17.975261 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:17.975146 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a693059-25b8-4d4d-89e9-694a049e62a0-trusted-ca-bundle\") pod \"console-84867b96fc-rbb57\" (UID: \"5a693059-25b8-4d4d-89e9-694a049e62a0\") " pod="openshift-console/console-84867b96fc-rbb57" Apr 24 22:38:17.975461 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:17.975301 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5a693059-25b8-4d4d-89e9-694a049e62a0-oauth-serving-cert\") pod \"console-84867b96fc-rbb57\" (UID: \"5a693059-25b8-4d4d-89e9-694a049e62a0\") " pod="openshift-console/console-84867b96fc-rbb57" Apr 24 22:38:17.975859 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:17.975835 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5a693059-25b8-4d4d-89e9-694a049e62a0-console-config\") pod \"console-84867b96fc-rbb57\" (UID: \"5a693059-25b8-4d4d-89e9-694a049e62a0\") " pod="openshift-console/console-84867b96fc-rbb57" Apr 24 22:38:17.976049 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:17.976002 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5a693059-25b8-4d4d-89e9-694a049e62a0-service-ca\") pod \"console-84867b96fc-rbb57\" (UID: \"5a693059-25b8-4d4d-89e9-694a049e62a0\") " pod="openshift-console/console-84867b96fc-rbb57" Apr 24 22:38:17.976123 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:17.976060 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a693059-25b8-4d4d-89e9-694a049e62a0-trusted-ca-bundle\") pod \"console-84867b96fc-rbb57\" (UID: \"5a693059-25b8-4d4d-89e9-694a049e62a0\") " pod="openshift-console/console-84867b96fc-rbb57" Apr 24 22:38:17.976123 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:17.976096 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5a693059-25b8-4d4d-89e9-694a049e62a0-oauth-serving-cert\") pod \"console-84867b96fc-rbb57\" (UID: \"5a693059-25b8-4d4d-89e9-694a049e62a0\") " pod="openshift-console/console-84867b96fc-rbb57" Apr 24 22:38:17.977682 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:17.977655 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a693059-25b8-4d4d-89e9-694a049e62a0-console-serving-cert\") pod \"console-84867b96fc-rbb57\" (UID: \"5a693059-25b8-4d4d-89e9-694a049e62a0\") " pod="openshift-console/console-84867b96fc-rbb57"
Apr 24 22:38:17.977682 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:17.977663 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5a693059-25b8-4d4d-89e9-694a049e62a0-console-oauth-config\") pod \"console-84867b96fc-rbb57\" (UID: \"5a693059-25b8-4d4d-89e9-694a049e62a0\") " pod="openshift-console/console-84867b96fc-rbb57"
Apr 24 22:38:17.984888 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:17.984859 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-njsgz\" (UniqueName: \"kubernetes.io/projected/5a693059-25b8-4d4d-89e9-694a049e62a0-kube-api-access-njsgz\") pod \"console-84867b96fc-rbb57\" (UID: \"5a693059-25b8-4d4d-89e9-694a049e62a0\") " pod="openshift-console/console-84867b96fc-rbb57"
Apr 24 22:38:18.115011 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:18.114943 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-84867b96fc-rbb57"
Apr 24 22:38:18.236498 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:18.236457 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84867b96fc-rbb57"]
Apr 24 22:38:18.239127 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:38:18.239100 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a693059_25b8_4d4d_89e9_694a049e62a0.slice/crio-f85634c2a8e47762ab3ebb1241eded3e690d954e70ffd5230d858c2e468801c3 WatchSource:0}: Error finding container f85634c2a8e47762ab3ebb1241eded3e690d954e70ffd5230d858c2e468801c3: Status 404 returned error can't find the container with id f85634c2a8e47762ab3ebb1241eded3e690d954e70ffd5230d858c2e468801c3
Apr 24 22:38:18.240867 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:18.240850 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 22:38:18.416844 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:18.416750 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84867b96fc-rbb57" event={"ID":"5a693059-25b8-4d4d-89e9-694a049e62a0","Type":"ContainerStarted","Data":"620fde1b1bdfb84ced29a9d1673050e5cd0492703303358f2a4cae030fa974a8"}
Apr 24 22:38:18.416844 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:18.416790 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84867b96fc-rbb57" event={"ID":"5a693059-25b8-4d4d-89e9-694a049e62a0","Type":"ContainerStarted","Data":"f85634c2a8e47762ab3ebb1241eded3e690d954e70ffd5230d858c2e468801c3"}
Apr 24 22:38:26.773329 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:26.773276 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-84867b96fc-rbb57" podStartSLOduration=9.773261494 podStartE2EDuration="9.773261494s" podCreationTimestamp="2026-04-24 22:38:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:38:18.439091129 +0000 UTC m=+508.162683712" watchObservedRunningTime="2026-04-24 22:38:26.773261494 +0000 UTC m=+516.496854063"
Apr 24 22:38:26.773917 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:26.773763 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-96ngm"]
Apr 24 22:38:26.778315 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:26.778295 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-96ngm"
Apr 24 22:38:26.781071 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:26.781048 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 24 22:38:26.781225 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:26.781074 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 24 22:38:26.781225 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:26.781124 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 24 22:38:26.782421 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:26.782393 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-zj965\""
Apr 24 22:38:26.783901 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:26.783882 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-96ngm"]
Apr 24 22:38:26.852412 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:26.852381 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znvl7\" (UniqueName: \"kubernetes.io/projected/8ac16476-4c74-4b62-b80e-11e6fc65bd17-kube-api-access-znvl7\") pod \"s3-init-96ngm\" (UID: \"8ac16476-4c74-4b62-b80e-11e6fc65bd17\") " pod="kserve/s3-init-96ngm"
Apr 24 22:38:26.953782 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:26.953750 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-znvl7\" (UniqueName: \"kubernetes.io/projected/8ac16476-4c74-4b62-b80e-11e6fc65bd17-kube-api-access-znvl7\") pod \"s3-init-96ngm\" (UID: \"8ac16476-4c74-4b62-b80e-11e6fc65bd17\") " pod="kserve/s3-init-96ngm"
Apr 24 22:38:26.963272 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:26.963245 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-znvl7\" (UniqueName: \"kubernetes.io/projected/8ac16476-4c74-4b62-b80e-11e6fc65bd17-kube-api-access-znvl7\") pod \"s3-init-96ngm\" (UID: \"8ac16476-4c74-4b62-b80e-11e6fc65bd17\") " pod="kserve/s3-init-96ngm"
Apr 24 22:38:27.099887 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:27.099808 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-96ngm"
Apr 24 22:38:27.425182 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:27.425058 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-96ngm"]
Apr 24 22:38:27.427735 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:38:27.427708 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ac16476_4c74_4b62_b80e_11e6fc65bd17.slice/crio-9dbe065ebe9caad7e832b5abb543a9adf3c33f964ed52507df1bc614c77864f6 WatchSource:0}: Error finding container 9dbe065ebe9caad7e832b5abb543a9adf3c33f964ed52507df1bc614c77864f6: Status 404 returned error can't find the container with id 9dbe065ebe9caad7e832b5abb543a9adf3c33f964ed52507df1bc614c77864f6
Apr 24 22:38:27.442948 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:27.442925 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-96ngm" event={"ID":"8ac16476-4c74-4b62-b80e-11e6fc65bd17","Type":"ContainerStarted","Data":"9dbe065ebe9caad7e832b5abb543a9adf3c33f964ed52507df1bc614c77864f6"}
Apr 24 22:38:28.115672 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:28.115638 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-84867b96fc-rbb57"
Apr 24 22:38:28.116171 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:28.115692 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-84867b96fc-rbb57"
Apr 24 22:38:28.122749 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:28.122720 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-84867b96fc-rbb57"
Apr 24 22:38:28.451785 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:28.451686 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-84867b96fc-rbb57"
Apr 24 22:38:28.502881 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:28.502835 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-79674f6858-rbdnn"]
Apr 24 22:38:32.460162 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:32.460128 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-96ngm" event={"ID":"8ac16476-4c74-4b62-b80e-11e6fc65bd17","Type":"ContainerStarted","Data":"a867dd4a7a82c93219e36edca534690ca2c9c00f2eb4d1e3dfdfcbb8f6f95b13"}
Apr 24 22:38:32.477769 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:32.477725 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-96ngm" podStartSLOduration=2.064497732 podStartE2EDuration="6.47770934s" podCreationTimestamp="2026-04-24 22:38:26 +0000 UTC" firstStartedPulling="2026-04-24 22:38:27.429544225 +0000 UTC m=+517.153136774" lastFinishedPulling="2026-04-24 22:38:31.842755832 +0000 UTC m=+521.566348382" observedRunningTime="2026-04-24 22:38:32.477069489 +0000 UTC m=+522.200662060" watchObservedRunningTime="2026-04-24 22:38:32.47770934 +0000 UTC m=+522.201301913"
Apr 24 22:38:35.470096 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:35.470060 2574 generic.go:358] "Generic (PLEG): container finished" podID="8ac16476-4c74-4b62-b80e-11e6fc65bd17" containerID="a867dd4a7a82c93219e36edca534690ca2c9c00f2eb4d1e3dfdfcbb8f6f95b13" exitCode=0
Apr 24 22:38:35.470447 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:35.470136 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-96ngm" event={"ID":"8ac16476-4c74-4b62-b80e-11e6fc65bd17","Type":"ContainerDied","Data":"a867dd4a7a82c93219e36edca534690ca2c9c00f2eb4d1e3dfdfcbb8f6f95b13"}
Apr 24 22:38:36.601263 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:36.601238 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-96ngm"
Apr 24 22:38:36.746734 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:36.746645 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znvl7\" (UniqueName: \"kubernetes.io/projected/8ac16476-4c74-4b62-b80e-11e6fc65bd17-kube-api-access-znvl7\") pod \"8ac16476-4c74-4b62-b80e-11e6fc65bd17\" (UID: \"8ac16476-4c74-4b62-b80e-11e6fc65bd17\") "
Apr 24 22:38:36.748849 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:36.748826 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ac16476-4c74-4b62-b80e-11e6fc65bd17-kube-api-access-znvl7" (OuterVolumeSpecName: "kube-api-access-znvl7") pod "8ac16476-4c74-4b62-b80e-11e6fc65bd17" (UID: "8ac16476-4c74-4b62-b80e-11e6fc65bd17"). InnerVolumeSpecName "kube-api-access-znvl7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:38:36.847356 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:36.847328 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-znvl7\" (UniqueName: \"kubernetes.io/projected/8ac16476-4c74-4b62-b80e-11e6fc65bd17-kube-api-access-znvl7\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\""
Apr 24 22:38:37.480685 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:37.480656 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-96ngm"
Apr 24 22:38:37.480685 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:37.480666 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-96ngm" event={"ID":"8ac16476-4c74-4b62-b80e-11e6fc65bd17","Type":"ContainerDied","Data":"9dbe065ebe9caad7e832b5abb543a9adf3c33f964ed52507df1bc614c77864f6"}
Apr 24 22:38:37.480685 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:37.480691 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9dbe065ebe9caad7e832b5abb543a9adf3c33f964ed52507df1bc614c77864f6"
Apr 24 22:38:46.785054 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:46.785020 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m"]
Apr 24 22:38:46.785499 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:46.785482 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ac16476-4c74-4b62-b80e-11e6fc65bd17" containerName="s3-init"
Apr 24 22:38:46.785544 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:46.785503 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ac16476-4c74-4b62-b80e-11e6fc65bd17" containerName="s3-init"
Apr 24 22:38:46.785595 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:46.785584 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="8ac16476-4c74-4b62-b80e-11e6fc65bd17" containerName="s3-init"
Apr 24 22:38:46.789096 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:46.789079 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m"
Apr 24 22:38:46.791557 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:46.791532 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 24 22:38:46.792604 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:46.792581 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-58j9x\""
Apr 24 22:38:46.792718 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:46.792615 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\""
Apr 24 22:38:46.792718 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:46.792649 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-1-predictor-serving-cert\""
Apr 24 22:38:46.792718 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:46.792581 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 24 22:38:46.796092 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:46.796070 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m"]
Apr 24 22:38:46.832115 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:46.832083 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4d081b6b-d731-4112-85d1-cd2fe0fe77da-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m\" (UID: \"4d081b6b-d731-4112-85d1-cd2fe0fe77da\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m"
Apr 24 22:38:46.832266 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:46.832128 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d081b6b-d731-4112-85d1-cd2fe0fe77da-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m\" (UID: \"4d081b6b-d731-4112-85d1-cd2fe0fe77da\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m"
Apr 24 22:38:46.832266 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:46.832201 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntcl4\" (UniqueName: \"kubernetes.io/projected/4d081b6b-d731-4112-85d1-cd2fe0fe77da-kube-api-access-ntcl4\") pod \"isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m\" (UID: \"4d081b6b-d731-4112-85d1-cd2fe0fe77da\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m"
Apr 24 22:38:46.832266 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:46.832232 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d081b6b-d731-4112-85d1-cd2fe0fe77da-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m\" (UID: \"4d081b6b-d731-4112-85d1-cd2fe0fe77da\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m"
Apr 24 22:38:46.932797 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:46.932759 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ntcl4\" (UniqueName: \"kubernetes.io/projected/4d081b6b-d731-4112-85d1-cd2fe0fe77da-kube-api-access-ntcl4\") pod \"isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m\" (UID: \"4d081b6b-d731-4112-85d1-cd2fe0fe77da\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m"
Apr 24 22:38:46.932797 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:46.932791 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d081b6b-d731-4112-85d1-cd2fe0fe77da-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m\" (UID: \"4d081b6b-d731-4112-85d1-cd2fe0fe77da\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m"
Apr 24 22:38:46.933065 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:46.932881 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4d081b6b-d731-4112-85d1-cd2fe0fe77da-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m\" (UID: \"4d081b6b-d731-4112-85d1-cd2fe0fe77da\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m"
Apr 24 22:38:46.933243 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:46.933224 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d081b6b-d731-4112-85d1-cd2fe0fe77da-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m\" (UID: \"4d081b6b-d731-4112-85d1-cd2fe0fe77da\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m"
Apr 24 22:38:46.933331 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:46.933252 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d081b6b-d731-4112-85d1-cd2fe0fe77da-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m\" (UID: \"4d081b6b-d731-4112-85d1-cd2fe0fe77da\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m"
Apr 24 22:38:46.933610 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:46.933588 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4d081b6b-d731-4112-85d1-cd2fe0fe77da-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m\" (UID: \"4d081b6b-d731-4112-85d1-cd2fe0fe77da\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m"
Apr 24 22:38:46.935492 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:46.935468 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d081b6b-d731-4112-85d1-cd2fe0fe77da-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m\" (UID: \"4d081b6b-d731-4112-85d1-cd2fe0fe77da\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m"
Apr 24 22:38:46.944266 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:46.944241 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntcl4\" (UniqueName: \"kubernetes.io/projected/4d081b6b-d731-4112-85d1-cd2fe0fe77da-kube-api-access-ntcl4\") pod \"isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m\" (UID: \"4d081b6b-d731-4112-85d1-cd2fe0fe77da\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m"
Apr 24 22:38:47.100762 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:47.100686 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m"
Apr 24 22:38:47.229818 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:47.229784 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m"]
Apr 24 22:38:47.232220 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:38:47.232184 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d081b6b_d731_4112_85d1_cd2fe0fe77da.slice/crio-5144c2a7037e36c000eb8f2f0962350ec9385797ead8dfc00a4f10a2bf5cb726 WatchSource:0}: Error finding container 5144c2a7037e36c000eb8f2f0962350ec9385797ead8dfc00a4f10a2bf5cb726: Status 404 returned error can't find the container with id 5144c2a7037e36c000eb8f2f0962350ec9385797ead8dfc00a4f10a2bf5cb726
Apr 24 22:38:47.509394 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:47.509356 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m" event={"ID":"4d081b6b-d731-4112-85d1-cd2fe0fe77da","Type":"ContainerStarted","Data":"5144c2a7037e36c000eb8f2f0962350ec9385797ead8dfc00a4f10a2bf5cb726"}
Apr 24 22:38:50.521329 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:50.521288 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m" event={"ID":"4d081b6b-d731-4112-85d1-cd2fe0fe77da","Type":"ContainerStarted","Data":"72e1184ee58c3bdb169deeee8f1989dafb79bf34f94f45efa5898f2812810c2a"}
Apr 24 22:38:53.530730 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:53.530668 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-79674f6858-rbdnn" podUID="03578f00-0c49-4106-b29d-dd1d222a1078" containerName="console" containerID="cri-o://4b9799ef542a5cab1b751d9f7cd16ea4522ea817c078218ae80bf967aaf7ec63" gracePeriod=15
Apr 24 22:38:53.765466 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:53.765445 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-79674f6858-rbdnn_03578f00-0c49-4106-b29d-dd1d222a1078/console/0.log"
Apr 24 22:38:53.765570 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:53.765501 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-79674f6858-rbdnn"
Apr 24 22:38:53.786367 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:53.786296 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4bqh\" (UniqueName: \"kubernetes.io/projected/03578f00-0c49-4106-b29d-dd1d222a1078-kube-api-access-g4bqh\") pod \"03578f00-0c49-4106-b29d-dd1d222a1078\" (UID: \"03578f00-0c49-4106-b29d-dd1d222a1078\") "
Apr 24 22:38:53.786367 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:53.786345 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03578f00-0c49-4106-b29d-dd1d222a1078-trusted-ca-bundle\") pod \"03578f00-0c49-4106-b29d-dd1d222a1078\" (UID: \"03578f00-0c49-4106-b29d-dd1d222a1078\") "
Apr 24 22:38:53.786537 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:53.786386 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/03578f00-0c49-4106-b29d-dd1d222a1078-service-ca\") pod \"03578f00-0c49-4106-b29d-dd1d222a1078\" (UID: \"03578f00-0c49-4106-b29d-dd1d222a1078\") "
Apr 24 22:38:53.786537 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:53.786425 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/03578f00-0c49-4106-b29d-dd1d222a1078-console-config\") pod \"03578f00-0c49-4106-b29d-dd1d222a1078\" (UID: \"03578f00-0c49-4106-b29d-dd1d222a1078\") "
Apr 24 22:38:53.786537 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:53.786462 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/03578f00-0c49-4106-b29d-dd1d222a1078-console-serving-cert\") pod \"03578f00-0c49-4106-b29d-dd1d222a1078\" (UID: \"03578f00-0c49-4106-b29d-dd1d222a1078\") "
Apr 24 22:38:53.786537 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:53.786484 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/03578f00-0c49-4106-b29d-dd1d222a1078-oauth-serving-cert\") pod \"03578f00-0c49-4106-b29d-dd1d222a1078\" (UID: \"03578f00-0c49-4106-b29d-dd1d222a1078\") "
Apr 24 22:38:53.786537 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:53.786518 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/03578f00-0c49-4106-b29d-dd1d222a1078-console-oauth-config\") pod \"03578f00-0c49-4106-b29d-dd1d222a1078\" (UID: \"03578f00-0c49-4106-b29d-dd1d222a1078\") "
Apr 24 22:38:53.787590 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:53.787350 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03578f00-0c49-4106-b29d-dd1d222a1078-service-ca" (OuterVolumeSpecName: "service-ca") pod "03578f00-0c49-4106-b29d-dd1d222a1078" (UID: "03578f00-0c49-4106-b29d-dd1d222a1078"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:38:53.787756 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:53.787727 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03578f00-0c49-4106-b29d-dd1d222a1078-console-config" (OuterVolumeSpecName: "console-config") pod "03578f00-0c49-4106-b29d-dd1d222a1078" (UID: "03578f00-0c49-4106-b29d-dd1d222a1078"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:38:53.788016 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:53.787989 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03578f00-0c49-4106-b29d-dd1d222a1078-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "03578f00-0c49-4106-b29d-dd1d222a1078" (UID: "03578f00-0c49-4106-b29d-dd1d222a1078"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:38:53.788198 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:53.788173 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03578f00-0c49-4106-b29d-dd1d222a1078-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "03578f00-0c49-4106-b29d-dd1d222a1078" (UID: "03578f00-0c49-4106-b29d-dd1d222a1078"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:38:53.789152 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:53.789118 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03578f00-0c49-4106-b29d-dd1d222a1078-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "03578f00-0c49-4106-b29d-dd1d222a1078" (UID: "03578f00-0c49-4106-b29d-dd1d222a1078"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:38:53.789597 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:53.789574 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03578f00-0c49-4106-b29d-dd1d222a1078-kube-api-access-g4bqh" (OuterVolumeSpecName: "kube-api-access-g4bqh") pod "03578f00-0c49-4106-b29d-dd1d222a1078" (UID: "03578f00-0c49-4106-b29d-dd1d222a1078"). InnerVolumeSpecName "kube-api-access-g4bqh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:38:53.789721 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:53.789616 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03578f00-0c49-4106-b29d-dd1d222a1078-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "03578f00-0c49-4106-b29d-dd1d222a1078" (UID: "03578f00-0c49-4106-b29d-dd1d222a1078"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:38:53.887299 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:53.887271 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/03578f00-0c49-4106-b29d-dd1d222a1078-console-serving-cert\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\""
Apr 24 22:38:53.887299 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:53.887297 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/03578f00-0c49-4106-b29d-dd1d222a1078-oauth-serving-cert\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\""
Apr 24 22:38:53.887463 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:53.887307 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/03578f00-0c49-4106-b29d-dd1d222a1078-console-oauth-config\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\""
Apr 24 22:38:53.887463 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:53.887315 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g4bqh\" (UniqueName: \"kubernetes.io/projected/03578f00-0c49-4106-b29d-dd1d222a1078-kube-api-access-g4bqh\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\""
Apr 24 22:38:53.887463 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:53.887325 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03578f00-0c49-4106-b29d-dd1d222a1078-trusted-ca-bundle\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\""
Apr 24 22:38:53.887463 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:53.887335 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/03578f00-0c49-4106-b29d-dd1d222a1078-service-ca\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\""
Apr 24 22:38:53.887463 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:53.887345 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/03578f00-0c49-4106-b29d-dd1d222a1078-console-config\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\""
Apr 24 22:38:54.533922 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:54.533840 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-79674f6858-rbdnn_03578f00-0c49-4106-b29d-dd1d222a1078/console/0.log"
Apr 24 22:38:54.533922 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:54.533885 2574 generic.go:358] "Generic (PLEG): container finished" podID="03578f00-0c49-4106-b29d-dd1d222a1078" containerID="4b9799ef542a5cab1b751d9f7cd16ea4522ea817c078218ae80bf967aaf7ec63" exitCode=2
Apr 24 22:38:54.534411 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:54.533947 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-79674f6858-rbdnn"
Apr 24 22:38:54.534411 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:54.533983 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79674f6858-rbdnn" event={"ID":"03578f00-0c49-4106-b29d-dd1d222a1078","Type":"ContainerDied","Data":"4b9799ef542a5cab1b751d9f7cd16ea4522ea817c078218ae80bf967aaf7ec63"}
Apr 24 22:38:54.534411 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:54.534021 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79674f6858-rbdnn" event={"ID":"03578f00-0c49-4106-b29d-dd1d222a1078","Type":"ContainerDied","Data":"7cc7dcb95981f100c11a5e8cdd5e8500527ec60cf1910d69dee7ec29732cb620"}
Apr 24 22:38:54.534411 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:54.534038 2574 scope.go:117] "RemoveContainer" containerID="4b9799ef542a5cab1b751d9f7cd16ea4522ea817c078218ae80bf967aaf7ec63"
Apr 24 22:38:54.535558 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:54.535540 2574 generic.go:358] "Generic (PLEG): container finished" podID="4d081b6b-d731-4112-85d1-cd2fe0fe77da" containerID="72e1184ee58c3bdb169deeee8f1989dafb79bf34f94f45efa5898f2812810c2a" exitCode=0
Apr 24 22:38:54.535632 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:54.535571 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m" event={"ID":"4d081b6b-d731-4112-85d1-cd2fe0fe77da","Type":"ContainerDied","Data":"72e1184ee58c3bdb169deeee8f1989dafb79bf34f94f45efa5898f2812810c2a"}
Apr 24 22:38:54.546719 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:54.546703 2574 scope.go:117] "RemoveContainer" containerID="4b9799ef542a5cab1b751d9f7cd16ea4522ea817c078218ae80bf967aaf7ec63"
Apr 24 22:38:54.547004 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:38:54.546953 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b9799ef542a5cab1b751d9f7cd16ea4522ea817c078218ae80bf967aaf7ec63\": container with ID starting with 4b9799ef542a5cab1b751d9f7cd16ea4522ea817c078218ae80bf967aaf7ec63 not found: ID does not exist" containerID="4b9799ef542a5cab1b751d9f7cd16ea4522ea817c078218ae80bf967aaf7ec63"
Apr 24 22:38:54.547073 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:54.547018 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b9799ef542a5cab1b751d9f7cd16ea4522ea817c078218ae80bf967aaf7ec63"} err="failed to get container status \"4b9799ef542a5cab1b751d9f7cd16ea4522ea817c078218ae80bf967aaf7ec63\": rpc error: code = NotFound desc = could not find container \"4b9799ef542a5cab1b751d9f7cd16ea4522ea817c078218ae80bf967aaf7ec63\": container with ID starting with 4b9799ef542a5cab1b751d9f7cd16ea4522ea817c078218ae80bf967aaf7ec63 not found: ID does not exist"
Apr 24 22:38:54.571299 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:54.571274 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-79674f6858-rbdnn"]
Apr 24 22:38:54.572610 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:54.572588 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-79674f6858-rbdnn"]
Apr 24 22:38:54.846552 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:38:54.846462 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03578f00-0c49-4106-b29d-dd1d222a1078" path="/var/lib/kubelet/pods/03578f00-0c49-4106-b29d-dd1d222a1078/volumes"
Apr 24 22:39:07.583443 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:39:07.583364 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m" event={"ID":"4d081b6b-d731-4112-85d1-cd2fe0fe77da","Type":"ContainerStarted","Data":"20bee399824850624e6ad930454e8ef51916cc635e839a5d3e053717e8bc3895"}
Apr 24 22:39:09.593472 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:39:09.593437 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m" event={"ID":"4d081b6b-d731-4112-85d1-cd2fe0fe77da","Type":"ContainerStarted","Data":"da6226f51a10835d3d7e257bd8e8c1915d9da8709cb70ec8436162a96abb19b4"}
Apr 24 22:39:09.593845 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:39:09.593548 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m"
Apr 24 22:39:09.613376 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:39:09.613331 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m" podStartSLOduration=1.366776446 podStartE2EDuration="23.613318681s" podCreationTimestamp="2026-04-24 22:38:46 +0000 UTC" firstStartedPulling="2026-04-24 22:38:47.234093756 +0000 UTC m=+536.957686317" lastFinishedPulling="2026-04-24 22:39:09.480635997 +0000 UTC m=+559.204228552" observedRunningTime="2026-04-24 22:39:09.611439112 +0000 UTC m=+559.335031719" watchObservedRunningTime="2026-04-24 22:39:09.613318681 +0000 UTC m=+559.336911249"
Apr 24 22:39:10.596666 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:39:10.596635 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m"
Apr 24 22:39:10.597976 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:39:10.597913 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m" podUID="4d081b6b-d731-4112-85d1-cd2fe0fe77da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused"
Apr 24 22:39:11.599599 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:39:11.599561 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m" podUID="4d081b6b-d731-4112-85d1-cd2fe0fe77da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused"
Apr 24 22:39:16.604062 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:39:16.604033 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m"
Apr 24 22:39:16.604606 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:39:16.604581 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m" podUID="4d081b6b-d731-4112-85d1-cd2fe0fe77da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused"
Apr 24 22:39:26.605409 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:39:26.605368 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m" podUID="4d081b6b-d731-4112-85d1-cd2fe0fe77da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused"
Apr 24 22:39:36.605275 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:39:36.605231 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m" podUID="4d081b6b-d731-4112-85d1-cd2fe0fe77da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused"
Apr 24 22:39:46.604996 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:39:46.604935 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m" podUID="4d081b6b-d731-4112-85d1-cd2fe0fe77da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused"
Apr 24 22:39:50.788150 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:39:50.788113 2574 log.go:25]
"Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6z9wv_2ae54e6c-a291-4b2e-8885-ba0d08f9048c/console-operator/1.log" Apr 24 22:39:50.789708 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:39:50.789686 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6z9wv_2ae54e6c-a291-4b2e-8885-ba0d08f9048c/console-operator/1.log" Apr 24 22:39:56.605397 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:39:56.605356 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m" podUID="4d081b6b-d731-4112-85d1-cd2fe0fe77da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 22:40:06.605054 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:06.604942 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m" podUID="4d081b6b-d731-4112-85d1-cd2fe0fe77da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 22:40:06.760140 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:06.760103 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-1b9dc-5995d747dd-q5dbh"] Apr 24 22:40:06.760449 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:06.760435 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="03578f00-0c49-4106-b29d-dd1d222a1078" containerName="console" Apr 24 22:40:06.760509 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:06.760451 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="03578f00-0c49-4106-b29d-dd1d222a1078" containerName="console" Apr 24 22:40:06.760547 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:06.760522 2574 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="03578f00-0c49-4106-b29d-dd1d222a1078" containerName="console" Apr 24 22:40:06.763384 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:06.763364 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-1b9dc-5995d747dd-q5dbh" Apr 24 22:40:06.765806 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:06.765787 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-1b9dc-serving-cert\"" Apr 24 22:40:06.765919 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:06.765881 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-1b9dc-kube-rbac-proxy-sar-config\"" Apr 24 22:40:06.771176 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:06.771151 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-1b9dc-5995d747dd-q5dbh"] Apr 24 22:40:06.819836 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:06.819795 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f1af98a-c88d-49bb-af7e-c226b9119b89-openshift-service-ca-bundle\") pod \"switch-graph-1b9dc-5995d747dd-q5dbh\" (UID: \"9f1af98a-c88d-49bb-af7e-c226b9119b89\") " pod="kserve-ci-e2e-test/switch-graph-1b9dc-5995d747dd-q5dbh" Apr 24 22:40:06.820029 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:06.819856 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f1af98a-c88d-49bb-af7e-c226b9119b89-proxy-tls\") pod \"switch-graph-1b9dc-5995d747dd-q5dbh\" (UID: \"9f1af98a-c88d-49bb-af7e-c226b9119b89\") " pod="kserve-ci-e2e-test/switch-graph-1b9dc-5995d747dd-q5dbh" Apr 24 22:40:06.920989 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:06.920852 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f1af98a-c88d-49bb-af7e-c226b9119b89-openshift-service-ca-bundle\") pod \"switch-graph-1b9dc-5995d747dd-q5dbh\" (UID: \"9f1af98a-c88d-49bb-af7e-c226b9119b89\") " pod="kserve-ci-e2e-test/switch-graph-1b9dc-5995d747dd-q5dbh" Apr 24 22:40:06.920989 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:06.920943 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f1af98a-c88d-49bb-af7e-c226b9119b89-proxy-tls\") pod \"switch-graph-1b9dc-5995d747dd-q5dbh\" (UID: \"9f1af98a-c88d-49bb-af7e-c226b9119b89\") " pod="kserve-ci-e2e-test/switch-graph-1b9dc-5995d747dd-q5dbh" Apr 24 22:40:06.921198 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:40:06.921070 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-1b9dc-serving-cert: secret "switch-graph-1b9dc-serving-cert" not found Apr 24 22:40:06.921198 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:40:06.921148 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f1af98a-c88d-49bb-af7e-c226b9119b89-proxy-tls podName:9f1af98a-c88d-49bb-af7e-c226b9119b89 nodeName:}" failed. No retries permitted until 2026-04-24 22:40:07.421132276 +0000 UTC m=+617.144724824 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9f1af98a-c88d-49bb-af7e-c226b9119b89-proxy-tls") pod "switch-graph-1b9dc-5995d747dd-q5dbh" (UID: "9f1af98a-c88d-49bb-af7e-c226b9119b89") : secret "switch-graph-1b9dc-serving-cert" not found Apr 24 22:40:06.921550 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:06.921530 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f1af98a-c88d-49bb-af7e-c226b9119b89-openshift-service-ca-bundle\") pod \"switch-graph-1b9dc-5995d747dd-q5dbh\" (UID: \"9f1af98a-c88d-49bb-af7e-c226b9119b89\") " pod="kserve-ci-e2e-test/switch-graph-1b9dc-5995d747dd-q5dbh" Apr 24 22:40:07.425830 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:07.425791 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f1af98a-c88d-49bb-af7e-c226b9119b89-proxy-tls\") pod \"switch-graph-1b9dc-5995d747dd-q5dbh\" (UID: \"9f1af98a-c88d-49bb-af7e-c226b9119b89\") " pod="kserve-ci-e2e-test/switch-graph-1b9dc-5995d747dd-q5dbh" Apr 24 22:40:07.428134 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:07.428115 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f1af98a-c88d-49bb-af7e-c226b9119b89-proxy-tls\") pod \"switch-graph-1b9dc-5995d747dd-q5dbh\" (UID: \"9f1af98a-c88d-49bb-af7e-c226b9119b89\") " pod="kserve-ci-e2e-test/switch-graph-1b9dc-5995d747dd-q5dbh" Apr 24 22:40:07.673913 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:07.673877 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-1b9dc-5995d747dd-q5dbh" Apr 24 22:40:07.794422 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:07.794390 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-1b9dc-5995d747dd-q5dbh"] Apr 24 22:40:07.797628 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:40:07.797600 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f1af98a_c88d_49bb_af7e_c226b9119b89.slice/crio-4e87aaee7b0e9b746ad97e2ed8888eccb6733da2431654536d6fe568fe54fae2 WatchSource:0}: Error finding container 4e87aaee7b0e9b746ad97e2ed8888eccb6733da2431654536d6fe568fe54fae2: Status 404 returned error can't find the container with id 4e87aaee7b0e9b746ad97e2ed8888eccb6733da2431654536d6fe568fe54fae2 Apr 24 22:40:08.770973 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:08.770910 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-1b9dc-5995d747dd-q5dbh" event={"ID":"9f1af98a-c88d-49bb-af7e-c226b9119b89","Type":"ContainerStarted","Data":"4e87aaee7b0e9b746ad97e2ed8888eccb6733da2431654536d6fe568fe54fae2"} Apr 24 22:40:09.775226 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:09.775189 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-1b9dc-5995d747dd-q5dbh" event={"ID":"9f1af98a-c88d-49bb-af7e-c226b9119b89","Type":"ContainerStarted","Data":"d65357cebb1d185f0af604860c8a61e32c7cefae44499b2f5e716467573007ac"} Apr 24 22:40:09.775574 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:09.775302 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-1b9dc-5995d747dd-q5dbh" Apr 24 22:40:09.793734 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:09.793691 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-1b9dc-5995d747dd-q5dbh" 
podStartSLOduration=1.986495982 podStartE2EDuration="3.793679918s" podCreationTimestamp="2026-04-24 22:40:06 +0000 UTC" firstStartedPulling="2026-04-24 22:40:07.799804534 +0000 UTC m=+617.523397082" lastFinishedPulling="2026-04-24 22:40:09.60698847 +0000 UTC m=+619.330581018" observedRunningTime="2026-04-24 22:40:09.792619343 +0000 UTC m=+619.516211913" watchObservedRunningTime="2026-04-24 22:40:09.793679918 +0000 UTC m=+619.517272488" Apr 24 22:40:15.783661 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:15.783632 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-1b9dc-5995d747dd-q5dbh" Apr 24 22:40:16.605123 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:16.605094 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m" Apr 24 22:40:16.927570 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:16.927493 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-1b9dc-5995d747dd-q5dbh"] Apr 24 22:40:16.927914 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:16.927708 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-1b9dc-5995d747dd-q5dbh" podUID="9f1af98a-c88d-49bb-af7e-c226b9119b89" containerName="switch-graph-1b9dc" containerID="cri-o://d65357cebb1d185f0af604860c8a61e32c7cefae44499b2f5e716467573007ac" gracePeriod=30 Apr 24 22:40:20.782489 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:20.782452 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-1b9dc-5995d747dd-q5dbh" podUID="9f1af98a-c88d-49bb-af7e-c226b9119b89" containerName="switch-graph-1b9dc" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:40:25.782739 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:25.782704 2574 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/switch-graph-1b9dc-5995d747dd-q5dbh" podUID="9f1af98a-c88d-49bb-af7e-c226b9119b89" containerName="switch-graph-1b9dc" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:40:30.782434 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:30.782390 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-1b9dc-5995d747dd-q5dbh" podUID="9f1af98a-c88d-49bb-af7e-c226b9119b89" containerName="switch-graph-1b9dc" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:40:30.782910 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:30.782487 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-1b9dc-5995d747dd-q5dbh" Apr 24 22:40:35.782688 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:35.782650 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-1b9dc-5995d747dd-q5dbh" podUID="9f1af98a-c88d-49bb-af7e-c226b9119b89" containerName="switch-graph-1b9dc" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:40:40.782784 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:40.782740 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-1b9dc-5995d747dd-q5dbh" podUID="9f1af98a-c88d-49bb-af7e-c226b9119b89" containerName="switch-graph-1b9dc" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:40:45.782571 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:45.782532 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-1b9dc-5995d747dd-q5dbh" podUID="9f1af98a-c88d-49bb-af7e-c226b9119b89" containerName="switch-graph-1b9dc" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:40:46.718333 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:46.718301 2574 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/model-chainer-85d456f99f-98r87"] Apr 24 22:40:46.721602 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:46.721587 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-85d456f99f-98r87" Apr 24 22:40:46.723955 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:46.723925 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-serving-cert\"" Apr 24 22:40:46.724107 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:46.724006 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-kube-rbac-proxy-sar-config\"" Apr 24 22:40:46.727670 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:46.727651 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-85d456f99f-98r87"] Apr 24 22:40:46.767385 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:46.767349 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33b7f016-da48-4a91-9397-9afab6a1c89c-openshift-service-ca-bundle\") pod \"model-chainer-85d456f99f-98r87\" (UID: \"33b7f016-da48-4a91-9397-9afab6a1c89c\") " pod="kserve-ci-e2e-test/model-chainer-85d456f99f-98r87" Apr 24 22:40:46.767557 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:46.767481 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33b7f016-da48-4a91-9397-9afab6a1c89c-proxy-tls\") pod \"model-chainer-85d456f99f-98r87\" (UID: \"33b7f016-da48-4a91-9397-9afab6a1c89c\") " pod="kserve-ci-e2e-test/model-chainer-85d456f99f-98r87" Apr 24 22:40:46.868249 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:46.868218 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33b7f016-da48-4a91-9397-9afab6a1c89c-openshift-service-ca-bundle\") pod \"model-chainer-85d456f99f-98r87\" (UID: \"33b7f016-da48-4a91-9397-9afab6a1c89c\") " pod="kserve-ci-e2e-test/model-chainer-85d456f99f-98r87" Apr 24 22:40:46.868721 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:46.868288 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33b7f016-da48-4a91-9397-9afab6a1c89c-proxy-tls\") pod \"model-chainer-85d456f99f-98r87\" (UID: \"33b7f016-da48-4a91-9397-9afab6a1c89c\") " pod="kserve-ci-e2e-test/model-chainer-85d456f99f-98r87" Apr 24 22:40:46.868721 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:40:46.868377 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-serving-cert: secret "model-chainer-serving-cert" not found Apr 24 22:40:46.868721 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:40:46.868437 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33b7f016-da48-4a91-9397-9afab6a1c89c-proxy-tls podName:33b7f016-da48-4a91-9397-9afab6a1c89c nodeName:}" failed. No retries permitted until 2026-04-24 22:40:47.368420802 +0000 UTC m=+657.092013349 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/33b7f016-da48-4a91-9397-9afab6a1c89c-proxy-tls") pod "model-chainer-85d456f99f-98r87" (UID: "33b7f016-da48-4a91-9397-9afab6a1c89c") : secret "model-chainer-serving-cert" not found Apr 24 22:40:46.869049 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:46.869029 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33b7f016-da48-4a91-9397-9afab6a1c89c-openshift-service-ca-bundle\") pod \"model-chainer-85d456f99f-98r87\" (UID: \"33b7f016-da48-4a91-9397-9afab6a1c89c\") " pod="kserve-ci-e2e-test/model-chainer-85d456f99f-98r87" Apr 24 22:40:47.060262 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:47.060242 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-1b9dc-5995d747dd-q5dbh" Apr 24 22:40:47.170900 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:47.170865 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f1af98a-c88d-49bb-af7e-c226b9119b89-openshift-service-ca-bundle\") pod \"9f1af98a-c88d-49bb-af7e-c226b9119b89\" (UID: \"9f1af98a-c88d-49bb-af7e-c226b9119b89\") " Apr 24 22:40:47.171153 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:47.170986 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f1af98a-c88d-49bb-af7e-c226b9119b89-proxy-tls\") pod \"9f1af98a-c88d-49bb-af7e-c226b9119b89\" (UID: \"9f1af98a-c88d-49bb-af7e-c226b9119b89\") " Apr 24 22:40:47.171409 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:47.171331 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f1af98a-c88d-49bb-af7e-c226b9119b89-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod 
"9f1af98a-c88d-49bb-af7e-c226b9119b89" (UID: "9f1af98a-c88d-49bb-af7e-c226b9119b89"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:40:47.173186 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:47.173152 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f1af98a-c88d-49bb-af7e-c226b9119b89-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9f1af98a-c88d-49bb-af7e-c226b9119b89" (UID: "9f1af98a-c88d-49bb-af7e-c226b9119b89"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:40:47.272497 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:47.272420 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f1af98a-c88d-49bb-af7e-c226b9119b89-proxy-tls\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 22:40:47.272497 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:47.272447 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f1af98a-c88d-49bb-af7e-c226b9119b89-openshift-service-ca-bundle\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 22:40:47.373571 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:47.373534 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33b7f016-da48-4a91-9397-9afab6a1c89c-proxy-tls\") pod \"model-chainer-85d456f99f-98r87\" (UID: \"33b7f016-da48-4a91-9397-9afab6a1c89c\") " pod="kserve-ci-e2e-test/model-chainer-85d456f99f-98r87" Apr 24 22:40:47.375945 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:47.375916 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33b7f016-da48-4a91-9397-9afab6a1c89c-proxy-tls\") pod \"model-chainer-85d456f99f-98r87\" (UID: 
\"33b7f016-da48-4a91-9397-9afab6a1c89c\") " pod="kserve-ci-e2e-test/model-chainer-85d456f99f-98r87" Apr 24 22:40:47.632590 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:47.632550 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-85d456f99f-98r87" Apr 24 22:40:47.745844 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:47.745819 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-85d456f99f-98r87"] Apr 24 22:40:47.748538 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:40:47.748504 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33b7f016_da48_4a91_9397_9afab6a1c89c.slice/crio-2143ac1c7e0a2d5edd92222a3ff9ccd4e2d3976933de77fb35a64f7af57bb8b2 WatchSource:0}: Error finding container 2143ac1c7e0a2d5edd92222a3ff9ccd4e2d3976933de77fb35a64f7af57bb8b2: Status 404 returned error can't find the container with id 2143ac1c7e0a2d5edd92222a3ff9ccd4e2d3976933de77fb35a64f7af57bb8b2 Apr 24 22:40:47.896621 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:47.896527 2574 generic.go:358] "Generic (PLEG): container finished" podID="9f1af98a-c88d-49bb-af7e-c226b9119b89" containerID="d65357cebb1d185f0af604860c8a61e32c7cefae44499b2f5e716467573007ac" exitCode=0 Apr 24 22:40:47.896621 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:47.896582 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-1b9dc-5995d747dd-q5dbh" Apr 24 22:40:47.896621 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:47.896616 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-1b9dc-5995d747dd-q5dbh" event={"ID":"9f1af98a-c88d-49bb-af7e-c226b9119b89","Type":"ContainerDied","Data":"d65357cebb1d185f0af604860c8a61e32c7cefae44499b2f5e716467573007ac"} Apr 24 22:40:47.897210 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:47.896652 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-1b9dc-5995d747dd-q5dbh" event={"ID":"9f1af98a-c88d-49bb-af7e-c226b9119b89","Type":"ContainerDied","Data":"4e87aaee7b0e9b746ad97e2ed8888eccb6733da2431654536d6fe568fe54fae2"} Apr 24 22:40:47.897210 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:47.896667 2574 scope.go:117] "RemoveContainer" containerID="d65357cebb1d185f0af604860c8a61e32c7cefae44499b2f5e716467573007ac" Apr 24 22:40:47.898009 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:47.897982 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-85d456f99f-98r87" event={"ID":"33b7f016-da48-4a91-9397-9afab6a1c89c","Type":"ContainerStarted","Data":"fc41390d447d5c137fa77059a6118ce9f75c11e0434d3c8e6b7341b953d5e650"} Apr 24 22:40:47.898120 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:47.898017 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-85d456f99f-98r87" event={"ID":"33b7f016-da48-4a91-9397-9afab6a1c89c","Type":"ContainerStarted","Data":"2143ac1c7e0a2d5edd92222a3ff9ccd4e2d3976933de77fb35a64f7af57bb8b2"} Apr 24 22:40:47.898120 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:47.898049 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-85d456f99f-98r87" Apr 24 22:40:47.905322 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:47.905300 2574 scope.go:117] 
"RemoveContainer" containerID="d65357cebb1d185f0af604860c8a61e32c7cefae44499b2f5e716467573007ac" Apr 24 22:40:47.905577 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:40:47.905558 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d65357cebb1d185f0af604860c8a61e32c7cefae44499b2f5e716467573007ac\": container with ID starting with d65357cebb1d185f0af604860c8a61e32c7cefae44499b2f5e716467573007ac not found: ID does not exist" containerID="d65357cebb1d185f0af604860c8a61e32c7cefae44499b2f5e716467573007ac" Apr 24 22:40:47.905633 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:47.905584 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d65357cebb1d185f0af604860c8a61e32c7cefae44499b2f5e716467573007ac"} err="failed to get container status \"d65357cebb1d185f0af604860c8a61e32c7cefae44499b2f5e716467573007ac\": rpc error: code = NotFound desc = could not find container \"d65357cebb1d185f0af604860c8a61e32c7cefae44499b2f5e716467573007ac\": container with ID starting with d65357cebb1d185f0af604860c8a61e32c7cefae44499b2f5e716467573007ac not found: ID does not exist" Apr 24 22:40:47.915765 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:47.915729 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-85d456f99f-98r87" podStartSLOduration=1.915719199 podStartE2EDuration="1.915719199s" podCreationTimestamp="2026-04-24 22:40:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:40:47.914682044 +0000 UTC m=+657.638274613" watchObservedRunningTime="2026-04-24 22:40:47.915719199 +0000 UTC m=+657.639311769" Apr 24 22:40:47.925907 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:47.925884 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-1b9dc-5995d747dd-q5dbh"] Apr 24 
22:40:47.927223 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:47.927204 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-1b9dc-5995d747dd-q5dbh"] Apr 24 22:40:48.846196 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:48.846164 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f1af98a-c88d-49bb-af7e-c226b9119b89" path="/var/lib/kubelet/pods/9f1af98a-c88d-49bb-af7e-c226b9119b89/volumes" Apr 24 22:40:53.907263 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:53.907230 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-85d456f99f-98r87" Apr 24 22:40:56.824158 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:56.824131 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-85d456f99f-98r87"] Apr 24 22:40:56.824532 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:56.824325 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-85d456f99f-98r87" podUID="33b7f016-da48-4a91-9397-9afab6a1c89c" containerName="model-chainer" containerID="cri-o://fc41390d447d5c137fa77059a6118ce9f75c11e0434d3c8e6b7341b953d5e650" gracePeriod=30 Apr 24 22:40:57.006505 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:57.006473 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m"] Apr 24 22:40:57.006799 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:57.006776 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m" podUID="4d081b6b-d731-4112-85d1-cd2fe0fe77da" containerName="kserve-container" containerID="cri-o://20bee399824850624e6ad930454e8ef51916cc635e839a5d3e053717e8bc3895" gracePeriod=30 Apr 24 22:40:57.006888 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:57.006813 2574 kuberuntime_container.go:864] 
"Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m" podUID="4d081b6b-d731-4112-85d1-cd2fe0fe77da" containerName="kube-rbac-proxy" containerID="cri-o://da6226f51a10835d3d7e257bd8e8c1915d9da8709cb70ec8436162a96abb19b4" gracePeriod=30 Apr 24 22:40:57.930001 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:57.929944 2574 generic.go:358] "Generic (PLEG): container finished" podID="4d081b6b-d731-4112-85d1-cd2fe0fe77da" containerID="da6226f51a10835d3d7e257bd8e8c1915d9da8709cb70ec8436162a96abb19b4" exitCode=2 Apr 24 22:40:57.930367 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:57.930017 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m" event={"ID":"4d081b6b-d731-4112-85d1-cd2fe0fe77da","Type":"ContainerDied","Data":"da6226f51a10835d3d7e257bd8e8c1915d9da8709cb70ec8436162a96abb19b4"} Apr 24 22:40:58.905855 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:40:58.905813 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-85d456f99f-98r87" podUID="33b7f016-da48-4a91-9397-9afab6a1c89c" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:41:01.232806 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:01.232783 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m" Apr 24 22:41:01.294308 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:01.294230 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d081b6b-d731-4112-85d1-cd2fe0fe77da-proxy-tls\") pod \"4d081b6b-d731-4112-85d1-cd2fe0fe77da\" (UID: \"4d081b6b-d731-4112-85d1-cd2fe0fe77da\") " Apr 24 22:41:01.294308 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:01.294283 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntcl4\" (UniqueName: \"kubernetes.io/projected/4d081b6b-d731-4112-85d1-cd2fe0fe77da-kube-api-access-ntcl4\") pod \"4d081b6b-d731-4112-85d1-cd2fe0fe77da\" (UID: \"4d081b6b-d731-4112-85d1-cd2fe0fe77da\") " Apr 24 22:41:01.294481 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:01.294316 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d081b6b-d731-4112-85d1-cd2fe0fe77da-kserve-provision-location\") pod \"4d081b6b-d731-4112-85d1-cd2fe0fe77da\" (UID: \"4d081b6b-d731-4112-85d1-cd2fe0fe77da\") " Apr 24 22:41:01.294481 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:01.294338 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4d081b6b-d731-4112-85d1-cd2fe0fe77da-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"4d081b6b-d731-4112-85d1-cd2fe0fe77da\" (UID: \"4d081b6b-d731-4112-85d1-cd2fe0fe77da\") " Apr 24 22:41:01.294732 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:01.294689 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d081b6b-d731-4112-85d1-cd2fe0fe77da-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"4d081b6b-d731-4112-85d1-cd2fe0fe77da" (UID: "4d081b6b-d731-4112-85d1-cd2fe0fe77da"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:41:01.294842 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:01.294748 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d081b6b-d731-4112-85d1-cd2fe0fe77da-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-1-kube-rbac-proxy-sar-config") pod "4d081b6b-d731-4112-85d1-cd2fe0fe77da" (UID: "4d081b6b-d731-4112-85d1-cd2fe0fe77da"). InnerVolumeSpecName "isvc-sklearn-graph-1-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:41:01.296344 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:01.296321 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d081b6b-d731-4112-85d1-cd2fe0fe77da-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4d081b6b-d731-4112-85d1-cd2fe0fe77da" (UID: "4d081b6b-d731-4112-85d1-cd2fe0fe77da"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:41:01.296420 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:01.296342 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d081b6b-d731-4112-85d1-cd2fe0fe77da-kube-api-access-ntcl4" (OuterVolumeSpecName: "kube-api-access-ntcl4") pod "4d081b6b-d731-4112-85d1-cd2fe0fe77da" (UID: "4d081b6b-d731-4112-85d1-cd2fe0fe77da"). InnerVolumeSpecName "kube-api-access-ntcl4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:41:01.395936 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:01.395894 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d081b6b-d731-4112-85d1-cd2fe0fe77da-proxy-tls\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 22:41:01.395936 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:01.395930 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ntcl4\" (UniqueName: \"kubernetes.io/projected/4d081b6b-d731-4112-85d1-cd2fe0fe77da-kube-api-access-ntcl4\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 22:41:01.396164 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:01.395944 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d081b6b-d731-4112-85d1-cd2fe0fe77da-kserve-provision-location\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 22:41:01.396164 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:01.395991 2574 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4d081b6b-d731-4112-85d1-cd2fe0fe77da-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 22:41:01.942043 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:01.942003 2574 generic.go:358] "Generic (PLEG): container finished" podID="4d081b6b-d731-4112-85d1-cd2fe0fe77da" containerID="20bee399824850624e6ad930454e8ef51916cc635e839a5d3e053717e8bc3895" exitCode=0 Apr 24 22:41:01.942251 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:01.942106 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m" Apr 24 22:41:01.942251 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:01.942099 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m" event={"ID":"4d081b6b-d731-4112-85d1-cd2fe0fe77da","Type":"ContainerDied","Data":"20bee399824850624e6ad930454e8ef51916cc635e839a5d3e053717e8bc3895"} Apr 24 22:41:01.942251 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:01.942219 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m" event={"ID":"4d081b6b-d731-4112-85d1-cd2fe0fe77da","Type":"ContainerDied","Data":"5144c2a7037e36c000eb8f2f0962350ec9385797ead8dfc00a4f10a2bf5cb726"} Apr 24 22:41:01.942251 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:01.942234 2574 scope.go:117] "RemoveContainer" containerID="da6226f51a10835d3d7e257bd8e8c1915d9da8709cb70ec8436162a96abb19b4" Apr 24 22:41:01.950178 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:01.950157 2574 scope.go:117] "RemoveContainer" containerID="20bee399824850624e6ad930454e8ef51916cc635e839a5d3e053717e8bc3895" Apr 24 22:41:01.957571 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:01.957555 2574 scope.go:117] "RemoveContainer" containerID="72e1184ee58c3bdb169deeee8f1989dafb79bf34f94f45efa5898f2812810c2a" Apr 24 22:41:01.962984 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:01.962949 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m"] Apr 24 22:41:01.964772 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:01.964757 2574 scope.go:117] "RemoveContainer" containerID="da6226f51a10835d3d7e257bd8e8c1915d9da8709cb70ec8436162a96abb19b4" Apr 24 22:41:01.965105 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:41:01.965063 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"da6226f51a10835d3d7e257bd8e8c1915d9da8709cb70ec8436162a96abb19b4\": container with ID starting with da6226f51a10835d3d7e257bd8e8c1915d9da8709cb70ec8436162a96abb19b4 not found: ID does not exist" containerID="da6226f51a10835d3d7e257bd8e8c1915d9da8709cb70ec8436162a96abb19b4" Apr 24 22:41:01.965333 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:01.965102 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da6226f51a10835d3d7e257bd8e8c1915d9da8709cb70ec8436162a96abb19b4"} err="failed to get container status \"da6226f51a10835d3d7e257bd8e8c1915d9da8709cb70ec8436162a96abb19b4\": rpc error: code = NotFound desc = could not find container \"da6226f51a10835d3d7e257bd8e8c1915d9da8709cb70ec8436162a96abb19b4\": container with ID starting with da6226f51a10835d3d7e257bd8e8c1915d9da8709cb70ec8436162a96abb19b4 not found: ID does not exist" Apr 24 22:41:01.965333 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:01.965137 2574 scope.go:117] "RemoveContainer" containerID="20bee399824850624e6ad930454e8ef51916cc635e839a5d3e053717e8bc3895" Apr 24 22:41:01.965482 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:41:01.965419 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20bee399824850624e6ad930454e8ef51916cc635e839a5d3e053717e8bc3895\": container with ID starting with 20bee399824850624e6ad930454e8ef51916cc635e839a5d3e053717e8bc3895 not found: ID does not exist" containerID="20bee399824850624e6ad930454e8ef51916cc635e839a5d3e053717e8bc3895" Apr 24 22:41:01.965482 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:01.965445 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20bee399824850624e6ad930454e8ef51916cc635e839a5d3e053717e8bc3895"} err="failed to get container status \"20bee399824850624e6ad930454e8ef51916cc635e839a5d3e053717e8bc3895\": rpc error: code = NotFound desc = could not 
find container \"20bee399824850624e6ad930454e8ef51916cc635e839a5d3e053717e8bc3895\": container with ID starting with 20bee399824850624e6ad930454e8ef51916cc635e839a5d3e053717e8bc3895 not found: ID does not exist" Apr 24 22:41:01.965482 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:01.965464 2574 scope.go:117] "RemoveContainer" containerID="72e1184ee58c3bdb169deeee8f1989dafb79bf34f94f45efa5898f2812810c2a" Apr 24 22:41:01.965736 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:41:01.965712 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72e1184ee58c3bdb169deeee8f1989dafb79bf34f94f45efa5898f2812810c2a\": container with ID starting with 72e1184ee58c3bdb169deeee8f1989dafb79bf34f94f45efa5898f2812810c2a not found: ID does not exist" containerID="72e1184ee58c3bdb169deeee8f1989dafb79bf34f94f45efa5898f2812810c2a" Apr 24 22:41:01.965791 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:01.965743 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72e1184ee58c3bdb169deeee8f1989dafb79bf34f94f45efa5898f2812810c2a"} err="failed to get container status \"72e1184ee58c3bdb169deeee8f1989dafb79bf34f94f45efa5898f2812810c2a\": rpc error: code = NotFound desc = could not find container \"72e1184ee58c3bdb169deeee8f1989dafb79bf34f94f45efa5898f2812810c2a\": container with ID starting with 72e1184ee58c3bdb169deeee8f1989dafb79bf34f94f45efa5898f2812810c2a not found: ID does not exist" Apr 24 22:41:01.967031 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:01.967012 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-56dcdd5cdf-fdd6m"] Apr 24 22:41:02.846246 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:02.846206 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d081b6b-d731-4112-85d1-cd2fe0fe77da" path="/var/lib/kubelet/pods/4d081b6b-d731-4112-85d1-cd2fe0fe77da/volumes" Apr 24 
22:41:03.905369 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:03.905334 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-85d456f99f-98r87" podUID="33b7f016-da48-4a91-9397-9afab6a1c89c" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:41:08.905265 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:08.905221 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-85d456f99f-98r87" podUID="33b7f016-da48-4a91-9397-9afab6a1c89c" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:41:08.905658 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:08.905353 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-85d456f99f-98r87" Apr 24 22:41:13.905404 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:13.905368 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-85d456f99f-98r87" podUID="33b7f016-da48-4a91-9397-9afab6a1c89c" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:41:17.184660 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:17.184628 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-e938d-798f566f4f-4lmhz"] Apr 24 22:41:17.185017 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:17.184992 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d081b6b-d731-4112-85d1-cd2fe0fe77da" containerName="storage-initializer" Apr 24 22:41:17.185017 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:17.185003 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d081b6b-d731-4112-85d1-cd2fe0fe77da" containerName="storage-initializer" Apr 24 22:41:17.185017 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:17.185016 2574 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="4d081b6b-d731-4112-85d1-cd2fe0fe77da" containerName="kserve-container" Apr 24 22:41:17.185117 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:17.185022 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d081b6b-d731-4112-85d1-cd2fe0fe77da" containerName="kserve-container" Apr 24 22:41:17.185117 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:17.185040 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f1af98a-c88d-49bb-af7e-c226b9119b89" containerName="switch-graph-1b9dc" Apr 24 22:41:17.185117 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:17.185045 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f1af98a-c88d-49bb-af7e-c226b9119b89" containerName="switch-graph-1b9dc" Apr 24 22:41:17.185117 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:17.185052 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d081b6b-d731-4112-85d1-cd2fe0fe77da" containerName="kube-rbac-proxy" Apr 24 22:41:17.185117 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:17.185057 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d081b6b-d731-4112-85d1-cd2fe0fe77da" containerName="kube-rbac-proxy" Apr 24 22:41:17.185117 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:17.185101 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="4d081b6b-d731-4112-85d1-cd2fe0fe77da" containerName="kube-rbac-proxy" Apr 24 22:41:17.185117 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:17.185112 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="9f1af98a-c88d-49bb-af7e-c226b9119b89" containerName="switch-graph-1b9dc" Apr 24 22:41:17.185117 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:17.185119 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="4d081b6b-d731-4112-85d1-cd2fe0fe77da" containerName="kserve-container" Apr 24 22:41:17.188062 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:17.188045 
2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-e938d-798f566f4f-4lmhz" Apr 24 22:41:17.190476 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:17.190456 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-e938d-serving-cert\"" Apr 24 22:41:17.190691 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:17.190676 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-e938d-kube-rbac-proxy-sar-config\"" Apr 24 22:41:17.196570 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:17.196544 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-e938d-798f566f4f-4lmhz"] Apr 24 22:41:17.337096 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:17.337060 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d92fa55c-f6a7-4353-b1d9-a8935574e0fd-openshift-service-ca-bundle\") pod \"switch-graph-e938d-798f566f4f-4lmhz\" (UID: \"d92fa55c-f6a7-4353-b1d9-a8935574e0fd\") " pod="kserve-ci-e2e-test/switch-graph-e938d-798f566f4f-4lmhz" Apr 24 22:41:17.337264 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:17.337118 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d92fa55c-f6a7-4353-b1d9-a8935574e0fd-proxy-tls\") pod \"switch-graph-e938d-798f566f4f-4lmhz\" (UID: \"d92fa55c-f6a7-4353-b1d9-a8935574e0fd\") " pod="kserve-ci-e2e-test/switch-graph-e938d-798f566f4f-4lmhz" Apr 24 22:41:17.438466 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:17.438378 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d92fa55c-f6a7-4353-b1d9-a8935574e0fd-proxy-tls\") pod 
\"switch-graph-e938d-798f566f4f-4lmhz\" (UID: \"d92fa55c-f6a7-4353-b1d9-a8935574e0fd\") " pod="kserve-ci-e2e-test/switch-graph-e938d-798f566f4f-4lmhz" Apr 24 22:41:17.438622 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:17.438479 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d92fa55c-f6a7-4353-b1d9-a8935574e0fd-openshift-service-ca-bundle\") pod \"switch-graph-e938d-798f566f4f-4lmhz\" (UID: \"d92fa55c-f6a7-4353-b1d9-a8935574e0fd\") " pod="kserve-ci-e2e-test/switch-graph-e938d-798f566f4f-4lmhz" Apr 24 22:41:17.438622 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:41:17.438525 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-e938d-serving-cert: secret "switch-graph-e938d-serving-cert" not found Apr 24 22:41:17.438622 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:41:17.438591 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d92fa55c-f6a7-4353-b1d9-a8935574e0fd-proxy-tls podName:d92fa55c-f6a7-4353-b1d9-a8935574e0fd nodeName:}" failed. No retries permitted until 2026-04-24 22:41:17.938573751 +0000 UTC m=+687.662166303 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/d92fa55c-f6a7-4353-b1d9-a8935574e0fd-proxy-tls") pod "switch-graph-e938d-798f566f4f-4lmhz" (UID: "d92fa55c-f6a7-4353-b1d9-a8935574e0fd") : secret "switch-graph-e938d-serving-cert" not found Apr 24 22:41:17.439112 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:17.439091 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d92fa55c-f6a7-4353-b1d9-a8935574e0fd-openshift-service-ca-bundle\") pod \"switch-graph-e938d-798f566f4f-4lmhz\" (UID: \"d92fa55c-f6a7-4353-b1d9-a8935574e0fd\") " pod="kserve-ci-e2e-test/switch-graph-e938d-798f566f4f-4lmhz" Apr 24 22:41:17.944304 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:17.944266 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d92fa55c-f6a7-4353-b1d9-a8935574e0fd-proxy-tls\") pod \"switch-graph-e938d-798f566f4f-4lmhz\" (UID: \"d92fa55c-f6a7-4353-b1d9-a8935574e0fd\") " pod="kserve-ci-e2e-test/switch-graph-e938d-798f566f4f-4lmhz" Apr 24 22:41:17.946707 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:17.946677 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d92fa55c-f6a7-4353-b1d9-a8935574e0fd-proxy-tls\") pod \"switch-graph-e938d-798f566f4f-4lmhz\" (UID: \"d92fa55c-f6a7-4353-b1d9-a8935574e0fd\") " pod="kserve-ci-e2e-test/switch-graph-e938d-798f566f4f-4lmhz" Apr 24 22:41:18.098887 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:18.098851 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-e938d-798f566f4f-4lmhz" Apr 24 22:41:18.217778 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:18.217694 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-e938d-798f566f4f-4lmhz"] Apr 24 22:41:18.220734 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:41:18.220699 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd92fa55c_f6a7_4353_b1d9_a8935574e0fd.slice/crio-85f179fca184160b7bacad511c72c5e87a7b13da8766944cffaf3e4a3e1f1030 WatchSource:0}: Error finding container 85f179fca184160b7bacad511c72c5e87a7b13da8766944cffaf3e4a3e1f1030: Status 404 returned error can't find the container with id 85f179fca184160b7bacad511c72c5e87a7b13da8766944cffaf3e4a3e1f1030 Apr 24 22:41:18.905252 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:18.905214 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-85d456f99f-98r87" podUID="33b7f016-da48-4a91-9397-9afab6a1c89c" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:41:18.996735 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:18.996699 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-e938d-798f566f4f-4lmhz" event={"ID":"d92fa55c-f6a7-4353-b1d9-a8935574e0fd","Type":"ContainerStarted","Data":"aee443e8d25a0f3db2b3b93d78cb6f78c00de020a00b7faadb0e48b60fab7d7b"} Apr 24 22:41:18.996735 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:18.996738 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-e938d-798f566f4f-4lmhz" event={"ID":"d92fa55c-f6a7-4353-b1d9-a8935574e0fd","Type":"ContainerStarted","Data":"85f179fca184160b7bacad511c72c5e87a7b13da8766944cffaf3e4a3e1f1030"} Apr 24 22:41:18.997006 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:18.996841 2574 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-e938d-798f566f4f-4lmhz" Apr 24 22:41:19.015151 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:19.015105 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-e938d-798f566f4f-4lmhz" podStartSLOduration=2.015093182 podStartE2EDuration="2.015093182s" podCreationTimestamp="2026-04-24 22:41:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:41:19.012773658 +0000 UTC m=+688.736366227" watchObservedRunningTime="2026-04-24 22:41:19.015093182 +0000 UTC m=+688.738685793" Apr 24 22:41:23.905740 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:23.905698 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-85d456f99f-98r87" podUID="33b7f016-da48-4a91-9397-9afab6a1c89c" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:41:25.008229 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:25.008202 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-e938d-798f566f4f-4lmhz" Apr 24 22:41:26.954768 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:26.954742 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-85d456f99f-98r87" Apr 24 22:41:27.023558 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:27.023525 2574 generic.go:358] "Generic (PLEG): container finished" podID="33b7f016-da48-4a91-9397-9afab6a1c89c" containerID="fc41390d447d5c137fa77059a6118ce9f75c11e0434d3c8e6b7341b953d5e650" exitCode=0 Apr 24 22:41:27.023714 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:27.023580 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-85d456f99f-98r87" Apr 24 22:41:27.023714 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:27.023596 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-85d456f99f-98r87" event={"ID":"33b7f016-da48-4a91-9397-9afab6a1c89c","Type":"ContainerDied","Data":"fc41390d447d5c137fa77059a6118ce9f75c11e0434d3c8e6b7341b953d5e650"} Apr 24 22:41:27.023714 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:27.023619 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-85d456f99f-98r87" event={"ID":"33b7f016-da48-4a91-9397-9afab6a1c89c","Type":"ContainerDied","Data":"2143ac1c7e0a2d5edd92222a3ff9ccd4e2d3976933de77fb35a64f7af57bb8b2"} Apr 24 22:41:27.023714 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:27.023633 2574 scope.go:117] "RemoveContainer" containerID="fc41390d447d5c137fa77059a6118ce9f75c11e0434d3c8e6b7341b953d5e650" Apr 24 22:41:27.029660 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:27.029640 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33b7f016-da48-4a91-9397-9afab6a1c89c-openshift-service-ca-bundle\") pod \"33b7f016-da48-4a91-9397-9afab6a1c89c\" (UID: \"33b7f016-da48-4a91-9397-9afab6a1c89c\") " Apr 24 22:41:27.029760 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:27.029724 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33b7f016-da48-4a91-9397-9afab6a1c89c-proxy-tls\") pod \"33b7f016-da48-4a91-9397-9afab6a1c89c\" (UID: \"33b7f016-da48-4a91-9397-9afab6a1c89c\") " Apr 24 22:41:27.030022 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:27.030004 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33b7f016-da48-4a91-9397-9afab6a1c89c-openshift-service-ca-bundle" (OuterVolumeSpecName: 
"openshift-service-ca-bundle") pod "33b7f016-da48-4a91-9397-9afab6a1c89c" (UID: "33b7f016-da48-4a91-9397-9afab6a1c89c"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:41:27.031420 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:27.031401 2574 scope.go:117] "RemoveContainer" containerID="fc41390d447d5c137fa77059a6118ce9f75c11e0434d3c8e6b7341b953d5e650" Apr 24 22:41:27.031671 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:27.031652 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33b7f016-da48-4a91-9397-9afab6a1c89c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "33b7f016-da48-4a91-9397-9afab6a1c89c" (UID: "33b7f016-da48-4a91-9397-9afab6a1c89c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:41:27.031740 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:41:27.031674 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc41390d447d5c137fa77059a6118ce9f75c11e0434d3c8e6b7341b953d5e650\": container with ID starting with fc41390d447d5c137fa77059a6118ce9f75c11e0434d3c8e6b7341b953d5e650 not found: ID does not exist" containerID="fc41390d447d5c137fa77059a6118ce9f75c11e0434d3c8e6b7341b953d5e650" Apr 24 22:41:27.031740 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:27.031704 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc41390d447d5c137fa77059a6118ce9f75c11e0434d3c8e6b7341b953d5e650"} err="failed to get container status \"fc41390d447d5c137fa77059a6118ce9f75c11e0434d3c8e6b7341b953d5e650\": rpc error: code = NotFound desc = could not find container \"fc41390d447d5c137fa77059a6118ce9f75c11e0434d3c8e6b7341b953d5e650\": container with ID starting with fc41390d447d5c137fa77059a6118ce9f75c11e0434d3c8e6b7341b953d5e650 not found: ID does not exist" Apr 24 22:41:27.130938 
ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:27.130903 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33b7f016-da48-4a91-9397-9afab6a1c89c-proxy-tls\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\""
Apr 24 22:41:27.130938 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:27.130936 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33b7f016-da48-4a91-9397-9afab6a1c89c-openshift-service-ca-bundle\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\""
Apr 24 22:41:27.344562 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:27.344530 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-85d456f99f-98r87"]
Apr 24 22:41:27.348163 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:27.348141 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-85d456f99f-98r87"]
Apr 24 22:41:28.845900 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:28.845866 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33b7f016-da48-4a91-9397-9afab6a1c89c" path="/var/lib/kubelet/pods/33b7f016-da48-4a91-9397-9afab6a1c89c/volumes"
Apr 24 22:41:56.997397 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:56.997349 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-f8431-7f98d55687-f4dl5"]
Apr 24 22:41:56.997782 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:56.997711 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33b7f016-da48-4a91-9397-9afab6a1c89c" containerName="model-chainer"
Apr 24 22:41:56.997782 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:56.997722 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="33b7f016-da48-4a91-9397-9afab6a1c89c" containerName="model-chainer"
Apr 24 22:41:56.997853 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:56.997788 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="33b7f016-da48-4a91-9397-9afab6a1c89c" containerName="model-chainer"
Apr 24 22:41:57.002258 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:57.002241 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-f8431-7f98d55687-f4dl5"
Apr 24 22:41:57.006253 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:57.006224 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-f8431-serving-cert\""
Apr 24 22:41:57.006374 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:57.006224 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-f8431-kube-rbac-proxy-sar-config\""
Apr 24 22:41:57.011220 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:57.011196 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-f8431-7f98d55687-f4dl5"]
Apr 24 22:41:57.191156 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:57.191120 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c50b6da3-95f3-462c-96ea-3ccd90c93413-proxy-tls\") pod \"sequence-graph-f8431-7f98d55687-f4dl5\" (UID: \"c50b6da3-95f3-462c-96ea-3ccd90c93413\") " pod="kserve-ci-e2e-test/sequence-graph-f8431-7f98d55687-f4dl5"
Apr 24 22:41:57.191324 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:57.191195 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c50b6da3-95f3-462c-96ea-3ccd90c93413-openshift-service-ca-bundle\") pod \"sequence-graph-f8431-7f98d55687-f4dl5\" (UID: \"c50b6da3-95f3-462c-96ea-3ccd90c93413\") " pod="kserve-ci-e2e-test/sequence-graph-f8431-7f98d55687-f4dl5"
Apr 24 22:41:57.291721 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:57.291646 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c50b6da3-95f3-462c-96ea-3ccd90c93413-proxy-tls\") pod \"sequence-graph-f8431-7f98d55687-f4dl5\" (UID: \"c50b6da3-95f3-462c-96ea-3ccd90c93413\") " pod="kserve-ci-e2e-test/sequence-graph-f8431-7f98d55687-f4dl5"
Apr 24 22:41:57.291721 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:57.291696 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c50b6da3-95f3-462c-96ea-3ccd90c93413-openshift-service-ca-bundle\") pod \"sequence-graph-f8431-7f98d55687-f4dl5\" (UID: \"c50b6da3-95f3-462c-96ea-3ccd90c93413\") " pod="kserve-ci-e2e-test/sequence-graph-f8431-7f98d55687-f4dl5"
Apr 24 22:41:57.292286 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:57.292258 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c50b6da3-95f3-462c-96ea-3ccd90c93413-openshift-service-ca-bundle\") pod \"sequence-graph-f8431-7f98d55687-f4dl5\" (UID: \"c50b6da3-95f3-462c-96ea-3ccd90c93413\") " pod="kserve-ci-e2e-test/sequence-graph-f8431-7f98d55687-f4dl5"
Apr 24 22:41:57.294024 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:57.294004 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c50b6da3-95f3-462c-96ea-3ccd90c93413-proxy-tls\") pod \"sequence-graph-f8431-7f98d55687-f4dl5\" (UID: \"c50b6da3-95f3-462c-96ea-3ccd90c93413\") " pod="kserve-ci-e2e-test/sequence-graph-f8431-7f98d55687-f4dl5"
Apr 24 22:41:57.312321 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:57.312298 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-f8431-7f98d55687-f4dl5"
Apr 24 22:41:57.428893 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:57.428868 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-f8431-7f98d55687-f4dl5"]
Apr 24 22:41:57.431555 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:41:57.431525 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc50b6da3_95f3_462c_96ea_3ccd90c93413.slice/crio-88119f459fd65ab3093c0e950c280807eca9631208a549d718f4e129fd501c0a WatchSource:0}: Error finding container 88119f459fd65ab3093c0e950c280807eca9631208a549d718f4e129fd501c0a: Status 404 returned error can't find the container with id 88119f459fd65ab3093c0e950c280807eca9631208a549d718f4e129fd501c0a
Apr 24 22:41:58.121455 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:58.121417 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-f8431-7f98d55687-f4dl5" event={"ID":"c50b6da3-95f3-462c-96ea-3ccd90c93413","Type":"ContainerStarted","Data":"31e0ac8bfe0e47a86881cbcce1ff27baa139ceba77a59bdb62922e5155f97845"}
Apr 24 22:41:58.121920 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:58.121459 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-f8431-7f98d55687-f4dl5" event={"ID":"c50b6da3-95f3-462c-96ea-3ccd90c93413","Type":"ContainerStarted","Data":"88119f459fd65ab3093c0e950c280807eca9631208a549d718f4e129fd501c0a"}
Apr 24 22:41:58.121920 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:58.121601 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-f8431-7f98d55687-f4dl5"
Apr 24 22:41:58.138341 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:41:58.138298 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-f8431-7f98d55687-f4dl5" podStartSLOduration=2.138284649 podStartE2EDuration="2.138284649s" podCreationTimestamp="2026-04-24 22:41:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:41:58.13635159 +0000 UTC m=+727.859944161" watchObservedRunningTime="2026-04-24 22:41:58.138284649 +0000 UTC m=+727.861877219"
Apr 24 22:42:04.130704 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:42:04.130675 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-f8431-7f98d55687-f4dl5"
Apr 24 22:44:50.815810 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:44:50.815779 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6z9wv_2ae54e6c-a291-4b2e-8885-ba0d08f9048c/console-operator/1.log"
Apr 24 22:44:50.816762 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:44:50.816737 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6z9wv_2ae54e6c-a291-4b2e-8885-ba0d08f9048c/console-operator/1.log"
Apr 24 22:49:31.829768 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:49:31.829733 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-e938d-798f566f4f-4lmhz"]
Apr 24 22:49:31.832426 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:49:31.829976 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-e938d-798f566f4f-4lmhz" podUID="d92fa55c-f6a7-4353-b1d9-a8935574e0fd" containerName="switch-graph-e938d" containerID="cri-o://aee443e8d25a0f3db2b3b93d78cb6f78c00de020a00b7faadb0e48b60fab7d7b" gracePeriod=30
Apr 24 22:49:35.006339 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:49:35.006303 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-e938d-798f566f4f-4lmhz" podUID="d92fa55c-f6a7-4353-b1d9-a8935574e0fd" containerName="switch-graph-e938d" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:49:40.005854 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:49:40.005814 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-e938d-798f566f4f-4lmhz" podUID="d92fa55c-f6a7-4353-b1d9-a8935574e0fd" containerName="switch-graph-e938d" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:49:45.006513 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:49:45.006477 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-e938d-798f566f4f-4lmhz" podUID="d92fa55c-f6a7-4353-b1d9-a8935574e0fd" containerName="switch-graph-e938d" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:49:45.006902 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:49:45.006577 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-e938d-798f566f4f-4lmhz"
Apr 24 22:49:50.006980 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:49:50.006920 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-e938d-798f566f4f-4lmhz" podUID="d92fa55c-f6a7-4353-b1d9-a8935574e0fd" containerName="switch-graph-e938d" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:49:50.839698 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:49:50.839664 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6z9wv_2ae54e6c-a291-4b2e-8885-ba0d08f9048c/console-operator/1.log"
Apr 24 22:49:50.841849 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:49:50.841825 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6z9wv_2ae54e6c-a291-4b2e-8885-ba0d08f9048c/console-operator/1.log"
Apr 24 22:49:55.005980 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:49:55.005918 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-e938d-798f566f4f-4lmhz" podUID="d92fa55c-f6a7-4353-b1d9-a8935574e0fd" containerName="switch-graph-e938d" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:50:00.006333 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:00.006292 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-e938d-798f566f4f-4lmhz" podUID="d92fa55c-f6a7-4353-b1d9-a8935574e0fd" containerName="switch-graph-e938d" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:50:01.963482 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:01.963460 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-e938d-798f566f4f-4lmhz"
Apr 24 22:50:02.072261 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:02.072230 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d92fa55c-f6a7-4353-b1d9-a8935574e0fd-openshift-service-ca-bundle\") pod \"d92fa55c-f6a7-4353-b1d9-a8935574e0fd\" (UID: \"d92fa55c-f6a7-4353-b1d9-a8935574e0fd\") "
Apr 24 22:50:02.072426 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:02.072276 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d92fa55c-f6a7-4353-b1d9-a8935574e0fd-proxy-tls\") pod \"d92fa55c-f6a7-4353-b1d9-a8935574e0fd\" (UID: \"d92fa55c-f6a7-4353-b1d9-a8935574e0fd\") "
Apr 24 22:50:02.072606 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:02.072583 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d92fa55c-f6a7-4353-b1d9-a8935574e0fd-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "d92fa55c-f6a7-4353-b1d9-a8935574e0fd" (UID: "d92fa55c-f6a7-4353-b1d9-a8935574e0fd"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:50:02.074240 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:02.074220 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d92fa55c-f6a7-4353-b1d9-a8935574e0fd-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d92fa55c-f6a7-4353-b1d9-a8935574e0fd" (UID: "d92fa55c-f6a7-4353-b1d9-a8935574e0fd"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:50:02.173111 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:02.173080 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d92fa55c-f6a7-4353-b1d9-a8935574e0fd-openshift-service-ca-bundle\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\""
Apr 24 22:50:02.173111 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:02.173113 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d92fa55c-f6a7-4353-b1d9-a8935574e0fd-proxy-tls\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\""
Apr 24 22:50:02.534544 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:02.534456 2574 generic.go:358] "Generic (PLEG): container finished" podID="d92fa55c-f6a7-4353-b1d9-a8935574e0fd" containerID="aee443e8d25a0f3db2b3b93d78cb6f78c00de020a00b7faadb0e48b60fab7d7b" exitCode=0
Apr 24 22:50:02.534544 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:02.534523 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-e938d-798f566f4f-4lmhz"
Apr 24 22:50:02.534764 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:02.534540 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-e938d-798f566f4f-4lmhz" event={"ID":"d92fa55c-f6a7-4353-b1d9-a8935574e0fd","Type":"ContainerDied","Data":"aee443e8d25a0f3db2b3b93d78cb6f78c00de020a00b7faadb0e48b60fab7d7b"}
Apr 24 22:50:02.534764 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:02.534580 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-e938d-798f566f4f-4lmhz" event={"ID":"d92fa55c-f6a7-4353-b1d9-a8935574e0fd","Type":"ContainerDied","Data":"85f179fca184160b7bacad511c72c5e87a7b13da8766944cffaf3e4a3e1f1030"}
Apr 24 22:50:02.534764 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:02.534599 2574 scope.go:117] "RemoveContainer" containerID="aee443e8d25a0f3db2b3b93d78cb6f78c00de020a00b7faadb0e48b60fab7d7b"
Apr 24 22:50:02.542445 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:02.542428 2574 scope.go:117] "RemoveContainer" containerID="aee443e8d25a0f3db2b3b93d78cb6f78c00de020a00b7faadb0e48b60fab7d7b"
Apr 24 22:50:02.542706 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:50:02.542688 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aee443e8d25a0f3db2b3b93d78cb6f78c00de020a00b7faadb0e48b60fab7d7b\": container with ID starting with aee443e8d25a0f3db2b3b93d78cb6f78c00de020a00b7faadb0e48b60fab7d7b not found: ID does not exist" containerID="aee443e8d25a0f3db2b3b93d78cb6f78c00de020a00b7faadb0e48b60fab7d7b"
Apr 24 22:50:02.542784 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:02.542712 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aee443e8d25a0f3db2b3b93d78cb6f78c00de020a00b7faadb0e48b60fab7d7b"} err="failed to get container status \"aee443e8d25a0f3db2b3b93d78cb6f78c00de020a00b7faadb0e48b60fab7d7b\": rpc error: code = NotFound desc = could not find container \"aee443e8d25a0f3db2b3b93d78cb6f78c00de020a00b7faadb0e48b60fab7d7b\": container with ID starting with aee443e8d25a0f3db2b3b93d78cb6f78c00de020a00b7faadb0e48b60fab7d7b not found: ID does not exist"
Apr 24 22:50:02.554348 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:02.554326 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-e938d-798f566f4f-4lmhz"]
Apr 24 22:50:02.556386 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:02.556368 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-e938d-798f566f4f-4lmhz"]
Apr 24 22:50:02.851041 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:02.850947 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d92fa55c-f6a7-4353-b1d9-a8935574e0fd" path="/var/lib/kubelet/pods/d92fa55c-f6a7-4353-b1d9-a8935574e0fd/volumes"
Apr 24 22:50:11.764700 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:11.764668 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-f8431-7f98d55687-f4dl5"]
Apr 24 22:50:11.765164 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:11.764919 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-f8431-7f98d55687-f4dl5" podUID="c50b6da3-95f3-462c-96ea-3ccd90c93413" containerName="sequence-graph-f8431" containerID="cri-o://31e0ac8bfe0e47a86881cbcce1ff27baa139ceba77a59bdb62922e5155f97845" gracePeriod=30
Apr 24 22:50:14.128869 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:14.128825 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-f8431-7f98d55687-f4dl5" podUID="c50b6da3-95f3-462c-96ea-3ccd90c93413" containerName="sequence-graph-f8431" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:50:19.128805 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:19.128758 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-f8431-7f98d55687-f4dl5" podUID="c50b6da3-95f3-462c-96ea-3ccd90c93413" containerName="sequence-graph-f8431" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:50:24.128845 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:24.128804 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-f8431-7f98d55687-f4dl5" podUID="c50b6da3-95f3-462c-96ea-3ccd90c93413" containerName="sequence-graph-f8431" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:50:24.129305 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:24.128930 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-f8431-7f98d55687-f4dl5"
Apr 24 22:50:29.129867 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:29.129817 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-f8431-7f98d55687-f4dl5" podUID="c50b6da3-95f3-462c-96ea-3ccd90c93413" containerName="sequence-graph-f8431" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:50:32.063463 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:32.063431 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-5c921-c79bf4c85-pqgsq"]
Apr 24 22:50:32.063926 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:32.063886 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d92fa55c-f6a7-4353-b1d9-a8935574e0fd" containerName="switch-graph-e938d"
Apr 24 22:50:32.063926 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:32.063904 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92fa55c-f6a7-4353-b1d9-a8935574e0fd" containerName="switch-graph-e938d"
Apr 24 22:50:32.064065 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:32.064033 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="d92fa55c-f6a7-4353-b1d9-a8935574e0fd" containerName="switch-graph-e938d"
Apr 24 22:50:32.066869 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:32.066848 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-5c921-c79bf4c85-pqgsq"
Apr 24 22:50:32.069189 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:32.069170 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-5c921-serving-cert\""
Apr 24 22:50:32.069284 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:32.069173 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-5c921-kube-rbac-proxy-sar-config\""
Apr 24 22:50:32.073789 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:32.073771 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-5c921-c79bf4c85-pqgsq"]
Apr 24 22:50:32.231990 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:32.231929 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6eeabddb-ba1b-4485-bb59-cdfc3f0e0791-proxy-tls\") pod \"ensemble-graph-5c921-c79bf4c85-pqgsq\" (UID: \"6eeabddb-ba1b-4485-bb59-cdfc3f0e0791\") " pod="kserve-ci-e2e-test/ensemble-graph-5c921-c79bf4c85-pqgsq"
Apr 24 22:50:32.232158 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:32.232084 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6eeabddb-ba1b-4485-bb59-cdfc3f0e0791-openshift-service-ca-bundle\") pod \"ensemble-graph-5c921-c79bf4c85-pqgsq\" (UID: \"6eeabddb-ba1b-4485-bb59-cdfc3f0e0791\") " pod="kserve-ci-e2e-test/ensemble-graph-5c921-c79bf4c85-pqgsq"
Apr 24 22:50:32.332901 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:32.332813 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6eeabddb-ba1b-4485-bb59-cdfc3f0e0791-proxy-tls\") pod \"ensemble-graph-5c921-c79bf4c85-pqgsq\" (UID: \"6eeabddb-ba1b-4485-bb59-cdfc3f0e0791\") " pod="kserve-ci-e2e-test/ensemble-graph-5c921-c79bf4c85-pqgsq"
Apr 24 22:50:32.332901 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:32.332896 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6eeabddb-ba1b-4485-bb59-cdfc3f0e0791-openshift-service-ca-bundle\") pod \"ensemble-graph-5c921-c79bf4c85-pqgsq\" (UID: \"6eeabddb-ba1b-4485-bb59-cdfc3f0e0791\") " pod="kserve-ci-e2e-test/ensemble-graph-5c921-c79bf4c85-pqgsq"
Apr 24 22:50:32.333534 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:32.333511 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6eeabddb-ba1b-4485-bb59-cdfc3f0e0791-openshift-service-ca-bundle\") pod \"ensemble-graph-5c921-c79bf4c85-pqgsq\" (UID: \"6eeabddb-ba1b-4485-bb59-cdfc3f0e0791\") " pod="kserve-ci-e2e-test/ensemble-graph-5c921-c79bf4c85-pqgsq"
Apr 24 22:50:32.335260 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:32.335232 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6eeabddb-ba1b-4485-bb59-cdfc3f0e0791-proxy-tls\") pod \"ensemble-graph-5c921-c79bf4c85-pqgsq\" (UID: \"6eeabddb-ba1b-4485-bb59-cdfc3f0e0791\") " pod="kserve-ci-e2e-test/ensemble-graph-5c921-c79bf4c85-pqgsq"
Apr 24 22:50:32.377850 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:32.377804 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-5c921-c79bf4c85-pqgsq"
Apr 24 22:50:32.494406 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:32.494374 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-5c921-c79bf4c85-pqgsq"]
Apr 24 22:50:32.497615 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:50:32.497580 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6eeabddb_ba1b_4485_bb59_cdfc3f0e0791.slice/crio-93a94c34691c2abca15b2f64ae16ccb486cad9d95e2e7428bcafe47785d4af74 WatchSource:0}: Error finding container 93a94c34691c2abca15b2f64ae16ccb486cad9d95e2e7428bcafe47785d4af74: Status 404 returned error can't find the container with id 93a94c34691c2abca15b2f64ae16ccb486cad9d95e2e7428bcafe47785d4af74
Apr 24 22:50:32.499833 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:32.499818 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 22:50:32.620728 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:32.620699 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-5c921-c79bf4c85-pqgsq" event={"ID":"6eeabddb-ba1b-4485-bb59-cdfc3f0e0791","Type":"ContainerStarted","Data":"17df19c7d5f9be7e22415a384b36ceb72201682d8ab57180e258e5f39c18e758"}
Apr 24 22:50:32.620728 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:32.620730 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-5c921-c79bf4c85-pqgsq" event={"ID":"6eeabddb-ba1b-4485-bb59-cdfc3f0e0791","Type":"ContainerStarted","Data":"93a94c34691c2abca15b2f64ae16ccb486cad9d95e2e7428bcafe47785d4af74"}
Apr 24 22:50:32.620925 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:32.620835 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-5c921-c79bf4c85-pqgsq"
Apr 24 22:50:32.637366 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:32.637315 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-5c921-c79bf4c85-pqgsq" podStartSLOduration=0.637297266 podStartE2EDuration="637.297266ms" podCreationTimestamp="2026-04-24 22:50:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:50:32.636038998 +0000 UTC m=+1242.359631599" watchObservedRunningTime="2026-04-24 22:50:32.637297266 +0000 UTC m=+1242.360889837"
Apr 24 22:50:34.128849 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:34.128817 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-f8431-7f98d55687-f4dl5" podUID="c50b6da3-95f3-462c-96ea-3ccd90c93413" containerName="sequence-graph-f8431" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:50:38.630173 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:38.630146 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-5c921-c79bf4c85-pqgsq"
Apr 24 22:50:39.128622 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:39.128586 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-f8431-7f98d55687-f4dl5" podUID="c50b6da3-95f3-462c-96ea-3ccd90c93413" containerName="sequence-graph-f8431" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:50:41.901258 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:41.901230 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-f8431-7f98d55687-f4dl5"
Apr 24 22:50:42.013363 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:42.013323 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c50b6da3-95f3-462c-96ea-3ccd90c93413-openshift-service-ca-bundle\") pod \"c50b6da3-95f3-462c-96ea-3ccd90c93413\" (UID: \"c50b6da3-95f3-462c-96ea-3ccd90c93413\") "
Apr 24 22:50:42.013530 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:42.013435 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c50b6da3-95f3-462c-96ea-3ccd90c93413-proxy-tls\") pod \"c50b6da3-95f3-462c-96ea-3ccd90c93413\" (UID: \"c50b6da3-95f3-462c-96ea-3ccd90c93413\") "
Apr 24 22:50:42.013705 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:42.013682 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c50b6da3-95f3-462c-96ea-3ccd90c93413-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "c50b6da3-95f3-462c-96ea-3ccd90c93413" (UID: "c50b6da3-95f3-462c-96ea-3ccd90c93413"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:50:42.015467 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:42.015432 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c50b6da3-95f3-462c-96ea-3ccd90c93413-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c50b6da3-95f3-462c-96ea-3ccd90c93413" (UID: "c50b6da3-95f3-462c-96ea-3ccd90c93413"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:50:42.114552 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:42.114508 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c50b6da3-95f3-462c-96ea-3ccd90c93413-openshift-service-ca-bundle\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\""
Apr 24 22:50:42.114552 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:42.114543 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c50b6da3-95f3-462c-96ea-3ccd90c93413-proxy-tls\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\""
Apr 24 22:50:42.126486 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:42.126455 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-5c921-c79bf4c85-pqgsq"]
Apr 24 22:50:42.126719 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:42.126682 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-5c921-c79bf4c85-pqgsq" podUID="6eeabddb-ba1b-4485-bb59-cdfc3f0e0791" containerName="ensemble-graph-5c921" containerID="cri-o://17df19c7d5f9be7e22415a384b36ceb72201682d8ab57180e258e5f39c18e758" gracePeriod=30
Apr 24 22:50:42.653613 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:42.653577 2574 generic.go:358] "Generic (PLEG): container finished" podID="c50b6da3-95f3-462c-96ea-3ccd90c93413" containerID="31e0ac8bfe0e47a86881cbcce1ff27baa139ceba77a59bdb62922e5155f97845" exitCode=0
Apr 24 22:50:42.653777 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:42.653638 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-f8431-7f98d55687-f4dl5"
Apr 24 22:50:42.653777 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:42.653658 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-f8431-7f98d55687-f4dl5" event={"ID":"c50b6da3-95f3-462c-96ea-3ccd90c93413","Type":"ContainerDied","Data":"31e0ac8bfe0e47a86881cbcce1ff27baa139ceba77a59bdb62922e5155f97845"}
Apr 24 22:50:42.653777 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:42.653690 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-f8431-7f98d55687-f4dl5" event={"ID":"c50b6da3-95f3-462c-96ea-3ccd90c93413","Type":"ContainerDied","Data":"88119f459fd65ab3093c0e950c280807eca9631208a549d718f4e129fd501c0a"}
Apr 24 22:50:42.653777 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:42.653706 2574 scope.go:117] "RemoveContainer" containerID="31e0ac8bfe0e47a86881cbcce1ff27baa139ceba77a59bdb62922e5155f97845"
Apr 24 22:50:42.662351 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:42.662331 2574 scope.go:117] "RemoveContainer" containerID="31e0ac8bfe0e47a86881cbcce1ff27baa139ceba77a59bdb62922e5155f97845"
Apr 24 22:50:42.662594 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:50:42.662573 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31e0ac8bfe0e47a86881cbcce1ff27baa139ceba77a59bdb62922e5155f97845\": container with ID starting with 31e0ac8bfe0e47a86881cbcce1ff27baa139ceba77a59bdb62922e5155f97845 not found: ID does not exist" containerID="31e0ac8bfe0e47a86881cbcce1ff27baa139ceba77a59bdb62922e5155f97845"
Apr 24 22:50:42.662660 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:42.662602 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31e0ac8bfe0e47a86881cbcce1ff27baa139ceba77a59bdb62922e5155f97845"} err="failed to get container status \"31e0ac8bfe0e47a86881cbcce1ff27baa139ceba77a59bdb62922e5155f97845\": rpc error: code = NotFound desc = could not find container \"31e0ac8bfe0e47a86881cbcce1ff27baa139ceba77a59bdb62922e5155f97845\": container with ID starting with 31e0ac8bfe0e47a86881cbcce1ff27baa139ceba77a59bdb62922e5155f97845 not found: ID does not exist"
Apr 24 22:50:42.674347 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:42.674321 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-f8431-7f98d55687-f4dl5"]
Apr 24 22:50:42.678019 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:42.678000 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-f8431-7f98d55687-f4dl5"]
Apr 24 22:50:42.846099 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:42.846069 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c50b6da3-95f3-462c-96ea-3ccd90c93413" path="/var/lib/kubelet/pods/c50b6da3-95f3-462c-96ea-3ccd90c93413/volumes"
Apr 24 22:50:43.628135 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:43.628102 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-5c921-c79bf4c85-pqgsq" podUID="6eeabddb-ba1b-4485-bb59-cdfc3f0e0791" containerName="ensemble-graph-5c921" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:50:48.628426 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:48.628385 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-5c921-c79bf4c85-pqgsq" podUID="6eeabddb-ba1b-4485-bb59-cdfc3f0e0791" containerName="ensemble-graph-5c921" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:50:53.627725 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:53.627688 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-5c921-c79bf4c85-pqgsq" podUID="6eeabddb-ba1b-4485-bb59-cdfc3f0e0791" containerName="ensemble-graph-5c921" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:50:53.628110 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:53.627802 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-5c921-c79bf4c85-pqgsq"
Apr 24 22:50:58.627806 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:50:58.627769 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-5c921-c79bf4c85-pqgsq" podUID="6eeabddb-ba1b-4485-bb59-cdfc3f0e0791" containerName="ensemble-graph-5c921" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:51:03.627785 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:03.627744 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-5c921-c79bf4c85-pqgsq" podUID="6eeabddb-ba1b-4485-bb59-cdfc3f0e0791" containerName="ensemble-graph-5c921" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:51:08.627212 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:08.627177 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-5c921-c79bf4c85-pqgsq" podUID="6eeabddb-ba1b-4485-bb59-cdfc3f0e0791" containerName="ensemble-graph-5c921" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:51:11.948452 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:11.948419 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-32c7c-6ff7f8b44c-rwdm4"]
Apr 24 22:51:11.948923 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:11.948729 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c50b6da3-95f3-462c-96ea-3ccd90c93413" containerName="sequence-graph-f8431"
Apr 24 22:51:11.948923 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:11.948739 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="c50b6da3-95f3-462c-96ea-3ccd90c93413" containerName="sequence-graph-f8431"
Apr 24 22:51:11.948923 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:11.948807 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="c50b6da3-95f3-462c-96ea-3ccd90c93413" containerName="sequence-graph-f8431"
Apr 24 22:51:11.951518 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:11.951502 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-32c7c-6ff7f8b44c-rwdm4"
Apr 24 22:51:11.953811 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:11.953788 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-32c7c-serving-cert\""
Apr 24 22:51:11.953935 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:11.953791 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-32c7c-kube-rbac-proxy-sar-config\""
Apr 24 22:51:11.957248 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:11.957227 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-32c7c-6ff7f8b44c-rwdm4"]
Apr 24 22:51:12.061799 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:12.061763 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5159dba1-de60-46b3-92f2-5850e41fb850-proxy-tls\") pod \"sequence-graph-32c7c-6ff7f8b44c-rwdm4\" (UID: \"5159dba1-de60-46b3-92f2-5850e41fb850\") " pod="kserve-ci-e2e-test/sequence-graph-32c7c-6ff7f8b44c-rwdm4"
Apr 24 22:51:12.061978 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:12.061833 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5159dba1-de60-46b3-92f2-5850e41fb850-openshift-service-ca-bundle\") pod \"sequence-graph-32c7c-6ff7f8b44c-rwdm4\" (UID: 
\"5159dba1-de60-46b3-92f2-5850e41fb850\") " pod="kserve-ci-e2e-test/sequence-graph-32c7c-6ff7f8b44c-rwdm4" Apr 24 22:51:12.162999 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:12.162946 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5159dba1-de60-46b3-92f2-5850e41fb850-openshift-service-ca-bundle\") pod \"sequence-graph-32c7c-6ff7f8b44c-rwdm4\" (UID: \"5159dba1-de60-46b3-92f2-5850e41fb850\") " pod="kserve-ci-e2e-test/sequence-graph-32c7c-6ff7f8b44c-rwdm4" Apr 24 22:51:12.163133 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:12.163035 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5159dba1-de60-46b3-92f2-5850e41fb850-proxy-tls\") pod \"sequence-graph-32c7c-6ff7f8b44c-rwdm4\" (UID: \"5159dba1-de60-46b3-92f2-5850e41fb850\") " pod="kserve-ci-e2e-test/sequence-graph-32c7c-6ff7f8b44c-rwdm4" Apr 24 22:51:12.163179 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:51:12.163163 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-32c7c-serving-cert: secret "sequence-graph-32c7c-serving-cert" not found Apr 24 22:51:12.163239 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:51:12.163230 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5159dba1-de60-46b3-92f2-5850e41fb850-proxy-tls podName:5159dba1-de60-46b3-92f2-5850e41fb850 nodeName:}" failed. No retries permitted until 2026-04-24 22:51:12.66321224 +0000 UTC m=+1282.386804792 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/5159dba1-de60-46b3-92f2-5850e41fb850-proxy-tls") pod "sequence-graph-32c7c-6ff7f8b44c-rwdm4" (UID: "5159dba1-de60-46b3-92f2-5850e41fb850") : secret "sequence-graph-32c7c-serving-cert" not found Apr 24 22:51:12.163570 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:12.163553 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5159dba1-de60-46b3-92f2-5850e41fb850-openshift-service-ca-bundle\") pod \"sequence-graph-32c7c-6ff7f8b44c-rwdm4\" (UID: \"5159dba1-de60-46b3-92f2-5850e41fb850\") " pod="kserve-ci-e2e-test/sequence-graph-32c7c-6ff7f8b44c-rwdm4" Apr 24 22:51:12.185585 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:51:12.185550 2574 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6eeabddb_ba1b_4485_bb59_cdfc3f0e0791.slice/crio-conmon-17df19c7d5f9be7e22415a384b36ceb72201682d8ab57180e258e5f39c18e758.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6eeabddb_ba1b_4485_bb59_cdfc3f0e0791.slice/crio-17df19c7d5f9be7e22415a384b36ceb72201682d8ab57180e258e5f39c18e758.scope\": RecentStats: unable to find data in memory cache]" Apr 24 22:51:12.185690 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:51:12.185666 2574 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6eeabddb_ba1b_4485_bb59_cdfc3f0e0791.slice/crio-conmon-17df19c7d5f9be7e22415a384b36ceb72201682d8ab57180e258e5f39c18e758.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6eeabddb_ba1b_4485_bb59_cdfc3f0e0791.slice/crio-17df19c7d5f9be7e22415a384b36ceb72201682d8ab57180e258e5f39c18e758.scope\": RecentStats: unable to find data in memory cache]" Apr 24 22:51:12.306193 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:12.306172 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-5c921-c79bf4c85-pqgsq" Apr 24 22:51:12.364015 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:12.363990 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6eeabddb-ba1b-4485-bb59-cdfc3f0e0791-proxy-tls\") pod \"6eeabddb-ba1b-4485-bb59-cdfc3f0e0791\" (UID: \"6eeabddb-ba1b-4485-bb59-cdfc3f0e0791\") " Apr 24 22:51:12.364149 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:12.364092 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6eeabddb-ba1b-4485-bb59-cdfc3f0e0791-openshift-service-ca-bundle\") pod \"6eeabddb-ba1b-4485-bb59-cdfc3f0e0791\" (UID: \"6eeabddb-ba1b-4485-bb59-cdfc3f0e0791\") " Apr 24 22:51:12.364427 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:12.364403 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6eeabddb-ba1b-4485-bb59-cdfc3f0e0791-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "6eeabddb-ba1b-4485-bb59-cdfc3f0e0791" (UID: "6eeabddb-ba1b-4485-bb59-cdfc3f0e0791"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:51:12.366070 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:12.366048 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eeabddb-ba1b-4485-bb59-cdfc3f0e0791-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6eeabddb-ba1b-4485-bb59-cdfc3f0e0791" (UID: "6eeabddb-ba1b-4485-bb59-cdfc3f0e0791"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:51:12.465479 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:12.465453 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6eeabddb-ba1b-4485-bb59-cdfc3f0e0791-openshift-service-ca-bundle\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 22:51:12.465479 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:12.465475 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6eeabddb-ba1b-4485-bb59-cdfc3f0e0791-proxy-tls\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 22:51:12.667486 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:12.667454 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5159dba1-de60-46b3-92f2-5850e41fb850-proxy-tls\") pod \"sequence-graph-32c7c-6ff7f8b44c-rwdm4\" (UID: \"5159dba1-de60-46b3-92f2-5850e41fb850\") " pod="kserve-ci-e2e-test/sequence-graph-32c7c-6ff7f8b44c-rwdm4" Apr 24 22:51:12.667637 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:51:12.667596 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-32c7c-serving-cert: secret "sequence-graph-32c7c-serving-cert" not found Apr 24 22:51:12.667683 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:51:12.667650 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5159dba1-de60-46b3-92f2-5850e41fb850-proxy-tls 
podName:5159dba1-de60-46b3-92f2-5850e41fb850 nodeName:}" failed. No retries permitted until 2026-04-24 22:51:13.667636876 +0000 UTC m=+1283.391229424 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/5159dba1-de60-46b3-92f2-5850e41fb850-proxy-tls") pod "sequence-graph-32c7c-6ff7f8b44c-rwdm4" (UID: "5159dba1-de60-46b3-92f2-5850e41fb850") : secret "sequence-graph-32c7c-serving-cert" not found Apr 24 22:51:12.742740 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:12.742706 2574 generic.go:358] "Generic (PLEG): container finished" podID="6eeabddb-ba1b-4485-bb59-cdfc3f0e0791" containerID="17df19c7d5f9be7e22415a384b36ceb72201682d8ab57180e258e5f39c18e758" exitCode=0 Apr 24 22:51:12.742905 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:12.742776 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-5c921-c79bf4c85-pqgsq" Apr 24 22:51:12.742905 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:12.742791 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-5c921-c79bf4c85-pqgsq" event={"ID":"6eeabddb-ba1b-4485-bb59-cdfc3f0e0791","Type":"ContainerDied","Data":"17df19c7d5f9be7e22415a384b36ceb72201682d8ab57180e258e5f39c18e758"} Apr 24 22:51:12.742905 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:12.742835 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-5c921-c79bf4c85-pqgsq" event={"ID":"6eeabddb-ba1b-4485-bb59-cdfc3f0e0791","Type":"ContainerDied","Data":"93a94c34691c2abca15b2f64ae16ccb486cad9d95e2e7428bcafe47785d4af74"} Apr 24 22:51:12.742905 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:12.742852 2574 scope.go:117] "RemoveContainer" containerID="17df19c7d5f9be7e22415a384b36ceb72201682d8ab57180e258e5f39c18e758" Apr 24 22:51:12.750663 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:12.750646 2574 scope.go:117] "RemoveContainer" 
containerID="17df19c7d5f9be7e22415a384b36ceb72201682d8ab57180e258e5f39c18e758" Apr 24 22:51:12.750922 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:51:12.750903 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17df19c7d5f9be7e22415a384b36ceb72201682d8ab57180e258e5f39c18e758\": container with ID starting with 17df19c7d5f9be7e22415a384b36ceb72201682d8ab57180e258e5f39c18e758 not found: ID does not exist" containerID="17df19c7d5f9be7e22415a384b36ceb72201682d8ab57180e258e5f39c18e758" Apr 24 22:51:12.751007 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:12.750932 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17df19c7d5f9be7e22415a384b36ceb72201682d8ab57180e258e5f39c18e758"} err="failed to get container status \"17df19c7d5f9be7e22415a384b36ceb72201682d8ab57180e258e5f39c18e758\": rpc error: code = NotFound desc = could not find container \"17df19c7d5f9be7e22415a384b36ceb72201682d8ab57180e258e5f39c18e758\": container with ID starting with 17df19c7d5f9be7e22415a384b36ceb72201682d8ab57180e258e5f39c18e758 not found: ID does not exist" Apr 24 22:51:12.765109 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:12.765086 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-5c921-c79bf4c85-pqgsq"] Apr 24 22:51:12.766389 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:12.766367 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-5c921-c79bf4c85-pqgsq"] Apr 24 22:51:12.848492 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:12.848460 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6eeabddb-ba1b-4485-bb59-cdfc3f0e0791" path="/var/lib/kubelet/pods/6eeabddb-ba1b-4485-bb59-cdfc3f0e0791/volumes" Apr 24 22:51:13.674869 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:13.674829 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5159dba1-de60-46b3-92f2-5850e41fb850-proxy-tls\") pod \"sequence-graph-32c7c-6ff7f8b44c-rwdm4\" (UID: \"5159dba1-de60-46b3-92f2-5850e41fb850\") " pod="kserve-ci-e2e-test/sequence-graph-32c7c-6ff7f8b44c-rwdm4" Apr 24 22:51:13.677350 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:13.677329 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5159dba1-de60-46b3-92f2-5850e41fb850-proxy-tls\") pod \"sequence-graph-32c7c-6ff7f8b44c-rwdm4\" (UID: \"5159dba1-de60-46b3-92f2-5850e41fb850\") " pod="kserve-ci-e2e-test/sequence-graph-32c7c-6ff7f8b44c-rwdm4" Apr 24 22:51:13.761628 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:13.761600 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-32c7c-6ff7f8b44c-rwdm4" Apr 24 22:51:13.892848 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:13.892813 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-32c7c-6ff7f8b44c-rwdm4"] Apr 24 22:51:13.898524 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:51:13.898486 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5159dba1_de60_46b3_92f2_5850e41fb850.slice/crio-ad6cb584b03fbf4577369fca25c12170f19e9b5870459eee5edb1450d6b17f0f WatchSource:0}: Error finding container ad6cb584b03fbf4577369fca25c12170f19e9b5870459eee5edb1450d6b17f0f: Status 404 returned error can't find the container with id ad6cb584b03fbf4577369fca25c12170f19e9b5870459eee5edb1450d6b17f0f Apr 24 22:51:14.750765 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:14.750724 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-32c7c-6ff7f8b44c-rwdm4" 
event={"ID":"5159dba1-de60-46b3-92f2-5850e41fb850","Type":"ContainerStarted","Data":"08cad6bdc1d578871ec02825f268bb0f39d2aacb6c02d786d22f616cd396f916"} Apr 24 22:51:14.750765 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:14.750767 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-32c7c-6ff7f8b44c-rwdm4" event={"ID":"5159dba1-de60-46b3-92f2-5850e41fb850","Type":"ContainerStarted","Data":"ad6cb584b03fbf4577369fca25c12170f19e9b5870459eee5edb1450d6b17f0f"} Apr 24 22:51:14.751206 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:14.750866 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-32c7c-6ff7f8b44c-rwdm4" Apr 24 22:51:14.767763 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:14.767720 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-32c7c-6ff7f8b44c-rwdm4" podStartSLOduration=3.767705254 podStartE2EDuration="3.767705254s" podCreationTimestamp="2026-04-24 22:51:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:51:14.766108092 +0000 UTC m=+1284.489700662" watchObservedRunningTime="2026-04-24 22:51:14.767705254 +0000 UTC m=+1284.491297824" Apr 24 22:51:20.764312 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:20.764280 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-32c7c-6ff7f8b44c-rwdm4" Apr 24 22:51:22.030275 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:22.030237 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-32c7c-6ff7f8b44c-rwdm4"] Apr 24 22:51:22.030630 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:22.030442 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-32c7c-6ff7f8b44c-rwdm4" 
podUID="5159dba1-de60-46b3-92f2-5850e41fb850" containerName="sequence-graph-32c7c" containerID="cri-o://08cad6bdc1d578871ec02825f268bb0f39d2aacb6c02d786d22f616cd396f916" gracePeriod=30 Apr 24 22:51:25.758511 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:25.758469 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-32c7c-6ff7f8b44c-rwdm4" podUID="5159dba1-de60-46b3-92f2-5850e41fb850" containerName="sequence-graph-32c7c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:51:30.758474 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:30.758438 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-32c7c-6ff7f8b44c-rwdm4" podUID="5159dba1-de60-46b3-92f2-5850e41fb850" containerName="sequence-graph-32c7c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:51:35.758191 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:35.758150 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-32c7c-6ff7f8b44c-rwdm4" podUID="5159dba1-de60-46b3-92f2-5850e41fb850" containerName="sequence-graph-32c7c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:51:35.758575 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:35.758255 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-32c7c-6ff7f8b44c-rwdm4" Apr 24 22:51:40.758405 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:40.758358 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-32c7c-6ff7f8b44c-rwdm4" podUID="5159dba1-de60-46b3-92f2-5850e41fb850" containerName="sequence-graph-32c7c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:51:42.332694 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:42.332664 2574 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/ensemble-graph-4aba3-5f6b4f4ff7-vzzq7"] Apr 24 22:51:42.333088 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:42.332995 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6eeabddb-ba1b-4485-bb59-cdfc3f0e0791" containerName="ensemble-graph-5c921" Apr 24 22:51:42.333088 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:42.333007 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eeabddb-ba1b-4485-bb59-cdfc3f0e0791" containerName="ensemble-graph-5c921" Apr 24 22:51:42.333088 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:42.333062 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="6eeabddb-ba1b-4485-bb59-cdfc3f0e0791" containerName="ensemble-graph-5c921" Apr 24 22:51:42.335749 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:42.335725 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-4aba3-5f6b4f4ff7-vzzq7" Apr 24 22:51:42.338232 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:42.338201 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-4aba3-kube-rbac-proxy-sar-config\"" Apr 24 22:51:42.338326 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:42.338229 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-4aba3-serving-cert\"" Apr 24 22:51:42.344542 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:42.344523 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-4aba3-5f6b4f4ff7-vzzq7"] Apr 24 22:51:42.421436 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:42.421402 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8c8ff629-b2a2-4834-9df1-58b60e87c2e6-proxy-tls\") pod \"ensemble-graph-4aba3-5f6b4f4ff7-vzzq7\" (UID: 
\"8c8ff629-b2a2-4834-9df1-58b60e87c2e6\") " pod="kserve-ci-e2e-test/ensemble-graph-4aba3-5f6b4f4ff7-vzzq7" Apr 24 22:51:42.421616 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:42.421449 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c8ff629-b2a2-4834-9df1-58b60e87c2e6-openshift-service-ca-bundle\") pod \"ensemble-graph-4aba3-5f6b4f4ff7-vzzq7\" (UID: \"8c8ff629-b2a2-4834-9df1-58b60e87c2e6\") " pod="kserve-ci-e2e-test/ensemble-graph-4aba3-5f6b4f4ff7-vzzq7" Apr 24 22:51:42.522225 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:42.522189 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8c8ff629-b2a2-4834-9df1-58b60e87c2e6-proxy-tls\") pod \"ensemble-graph-4aba3-5f6b4f4ff7-vzzq7\" (UID: \"8c8ff629-b2a2-4834-9df1-58b60e87c2e6\") " pod="kserve-ci-e2e-test/ensemble-graph-4aba3-5f6b4f4ff7-vzzq7" Apr 24 22:51:42.522427 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:42.522245 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c8ff629-b2a2-4834-9df1-58b60e87c2e6-openshift-service-ca-bundle\") pod \"ensemble-graph-4aba3-5f6b4f4ff7-vzzq7\" (UID: \"8c8ff629-b2a2-4834-9df1-58b60e87c2e6\") " pod="kserve-ci-e2e-test/ensemble-graph-4aba3-5f6b4f4ff7-vzzq7" Apr 24 22:51:42.522427 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:51:42.522349 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/ensemble-graph-4aba3-serving-cert: secret "ensemble-graph-4aba3-serving-cert" not found Apr 24 22:51:42.522538 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:51:42.522445 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c8ff629-b2a2-4834-9df1-58b60e87c2e6-proxy-tls podName:8c8ff629-b2a2-4834-9df1-58b60e87c2e6 nodeName:}" failed. 
No retries permitted until 2026-04-24 22:51:43.022419671 +0000 UTC m=+1312.746012233 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/8c8ff629-b2a2-4834-9df1-58b60e87c2e6-proxy-tls") pod "ensemble-graph-4aba3-5f6b4f4ff7-vzzq7" (UID: "8c8ff629-b2a2-4834-9df1-58b60e87c2e6") : secret "ensemble-graph-4aba3-serving-cert" not found Apr 24 22:51:42.522820 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:42.522802 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c8ff629-b2a2-4834-9df1-58b60e87c2e6-openshift-service-ca-bundle\") pod \"ensemble-graph-4aba3-5f6b4f4ff7-vzzq7\" (UID: \"8c8ff629-b2a2-4834-9df1-58b60e87c2e6\") " pod="kserve-ci-e2e-test/ensemble-graph-4aba3-5f6b4f4ff7-vzzq7" Apr 24 22:51:43.026899 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:43.026864 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8c8ff629-b2a2-4834-9df1-58b60e87c2e6-proxy-tls\") pod \"ensemble-graph-4aba3-5f6b4f4ff7-vzzq7\" (UID: \"8c8ff629-b2a2-4834-9df1-58b60e87c2e6\") " pod="kserve-ci-e2e-test/ensemble-graph-4aba3-5f6b4f4ff7-vzzq7" Apr 24 22:51:43.029276 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:43.029249 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8c8ff629-b2a2-4834-9df1-58b60e87c2e6-proxy-tls\") pod \"ensemble-graph-4aba3-5f6b4f4ff7-vzzq7\" (UID: \"8c8ff629-b2a2-4834-9df1-58b60e87c2e6\") " pod="kserve-ci-e2e-test/ensemble-graph-4aba3-5f6b4f4ff7-vzzq7" Apr 24 22:51:43.245420 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:43.245383 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-4aba3-5f6b4f4ff7-vzzq7" Apr 24 22:51:43.361600 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:43.361563 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-4aba3-5f6b4f4ff7-vzzq7"] Apr 24 22:51:43.364641 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:51:43.364604 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c8ff629_b2a2_4834_9df1_58b60e87c2e6.slice/crio-271c6b17e7421b476c323f81803c137e361e981edfc998df8d3621802442f00d WatchSource:0}: Error finding container 271c6b17e7421b476c323f81803c137e361e981edfc998df8d3621802442f00d: Status 404 returned error can't find the container with id 271c6b17e7421b476c323f81803c137e361e981edfc998df8d3621802442f00d Apr 24 22:51:43.837076 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:43.837035 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-4aba3-5f6b4f4ff7-vzzq7" event={"ID":"8c8ff629-b2a2-4834-9df1-58b60e87c2e6","Type":"ContainerStarted","Data":"4afdd2da92626e6c0b8b6b7109091eca04f3a255d801762de14ea79847e7ccf5"} Apr 24 22:51:43.837076 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:43.837078 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-4aba3-5f6b4f4ff7-vzzq7" Apr 24 22:51:43.837335 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:43.837087 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-4aba3-5f6b4f4ff7-vzzq7" event={"ID":"8c8ff629-b2a2-4834-9df1-58b60e87c2e6","Type":"ContainerStarted","Data":"271c6b17e7421b476c323f81803c137e361e981edfc998df8d3621802442f00d"} Apr 24 22:51:43.851354 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:43.851311 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-4aba3-5f6b4f4ff7-vzzq7" 
podStartSLOduration=1.851296626 podStartE2EDuration="1.851296626s" podCreationTimestamp="2026-04-24 22:51:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:51:43.850842841 +0000 UTC m=+1313.574435408" watchObservedRunningTime="2026-04-24 22:51:43.851296626 +0000 UTC m=+1313.574889202" Apr 24 22:51:45.758231 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:45.758198 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-32c7c-6ff7f8b44c-rwdm4" podUID="5159dba1-de60-46b3-92f2-5850e41fb850" containerName="sequence-graph-32c7c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:51:49.845367 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:49.845334 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-4aba3-5f6b4f4ff7-vzzq7" Apr 24 22:51:50.757833 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:50.757797 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-32c7c-6ff7f8b44c-rwdm4" podUID="5159dba1-de60-46b3-92f2-5850e41fb850" containerName="sequence-graph-32c7c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:51:52.160642 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:52.160619 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-32c7c-6ff7f8b44c-rwdm4" Apr 24 22:51:52.202437 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:52.202409 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5159dba1-de60-46b3-92f2-5850e41fb850-proxy-tls\") pod \"5159dba1-de60-46b3-92f2-5850e41fb850\" (UID: \"5159dba1-de60-46b3-92f2-5850e41fb850\") " Apr 24 22:51:52.202587 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:52.202498 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5159dba1-de60-46b3-92f2-5850e41fb850-openshift-service-ca-bundle\") pod \"5159dba1-de60-46b3-92f2-5850e41fb850\" (UID: \"5159dba1-de60-46b3-92f2-5850e41fb850\") " Apr 24 22:51:52.202836 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:52.202812 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5159dba1-de60-46b3-92f2-5850e41fb850-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "5159dba1-de60-46b3-92f2-5850e41fb850" (UID: "5159dba1-de60-46b3-92f2-5850e41fb850"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:51:52.204407 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:52.204391 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5159dba1-de60-46b3-92f2-5850e41fb850-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5159dba1-de60-46b3-92f2-5850e41fb850" (UID: "5159dba1-de60-46b3-92f2-5850e41fb850"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:51:52.303006 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:52.302941 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5159dba1-de60-46b3-92f2-5850e41fb850-openshift-service-ca-bundle\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 22:51:52.303006 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:52.302981 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5159dba1-de60-46b3-92f2-5850e41fb850-proxy-tls\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 22:51:52.862037 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:52.862005 2574 generic.go:358] "Generic (PLEG): container finished" podID="5159dba1-de60-46b3-92f2-5850e41fb850" containerID="08cad6bdc1d578871ec02825f268bb0f39d2aacb6c02d786d22f616cd396f916" exitCode=0 Apr 24 22:51:52.862180 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:52.862066 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-32c7c-6ff7f8b44c-rwdm4" event={"ID":"5159dba1-de60-46b3-92f2-5850e41fb850","Type":"ContainerDied","Data":"08cad6bdc1d578871ec02825f268bb0f39d2aacb6c02d786d22f616cd396f916"} Apr 24 22:51:52.862180 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:52.862074 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-32c7c-6ff7f8b44c-rwdm4" Apr 24 22:51:52.862180 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:52.862093 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-32c7c-6ff7f8b44c-rwdm4" event={"ID":"5159dba1-de60-46b3-92f2-5850e41fb850","Type":"ContainerDied","Data":"ad6cb584b03fbf4577369fca25c12170f19e9b5870459eee5edb1450d6b17f0f"} Apr 24 22:51:52.862180 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:52.862113 2574 scope.go:117] "RemoveContainer" containerID="08cad6bdc1d578871ec02825f268bb0f39d2aacb6c02d786d22f616cd396f916" Apr 24 22:51:52.870058 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:52.870040 2574 scope.go:117] "RemoveContainer" containerID="08cad6bdc1d578871ec02825f268bb0f39d2aacb6c02d786d22f616cd396f916" Apr 24 22:51:52.870309 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:51:52.870290 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08cad6bdc1d578871ec02825f268bb0f39d2aacb6c02d786d22f616cd396f916\": container with ID starting with 08cad6bdc1d578871ec02825f268bb0f39d2aacb6c02d786d22f616cd396f916 not found: ID does not exist" containerID="08cad6bdc1d578871ec02825f268bb0f39d2aacb6c02d786d22f616cd396f916" Apr 24 22:51:52.870356 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:52.870318 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08cad6bdc1d578871ec02825f268bb0f39d2aacb6c02d786d22f616cd396f916"} err="failed to get container status \"08cad6bdc1d578871ec02825f268bb0f39d2aacb6c02d786d22f616cd396f916\": rpc error: code = NotFound desc = could not find container \"08cad6bdc1d578871ec02825f268bb0f39d2aacb6c02d786d22f616cd396f916\": container with ID starting with 08cad6bdc1d578871ec02825f268bb0f39d2aacb6c02d786d22f616cd396f916 not found: ID does not exist" Apr 24 22:51:52.877099 ip-10-0-133-161 kubenswrapper[2574]: I0424 
22:51:52.877068 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-32c7c-6ff7f8b44c-rwdm4"] Apr 24 22:51:52.878743 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:52.878720 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-32c7c-6ff7f8b44c-rwdm4"] Apr 24 22:51:54.846553 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:51:54.846515 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5159dba1-de60-46b3-92f2-5850e41fb850" path="/var/lib/kubelet/pods/5159dba1-de60-46b3-92f2-5850e41fb850/volumes" Apr 24 22:52:22.239973 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:52:22.239880 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-2e6fe-dcb8bc6f8-mvk48"] Apr 24 22:52:22.240445 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:52:22.240217 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5159dba1-de60-46b3-92f2-5850e41fb850" containerName="sequence-graph-32c7c" Apr 24 22:52:22.240445 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:52:22.240229 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="5159dba1-de60-46b3-92f2-5850e41fb850" containerName="sequence-graph-32c7c" Apr 24 22:52:22.240445 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:52:22.240285 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="5159dba1-de60-46b3-92f2-5850e41fb850" containerName="sequence-graph-32c7c" Apr 24 22:52:22.243313 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:52:22.243298 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-2e6fe-dcb8bc6f8-mvk48" Apr 24 22:52:22.245562 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:52:22.245541 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-2e6fe-kube-rbac-proxy-sar-config\"" Apr 24 22:52:22.245562 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:52:22.245552 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-2e6fe-serving-cert\"" Apr 24 22:52:22.249490 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:52:22.249465 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-2e6fe-dcb8bc6f8-mvk48"] Apr 24 22:52:22.351519 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:52:22.351485 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8c55da2-39df-4339-8f89-142de9271828-openshift-service-ca-bundle\") pod \"sequence-graph-2e6fe-dcb8bc6f8-mvk48\" (UID: \"c8c55da2-39df-4339-8f89-142de9271828\") " pod="kserve-ci-e2e-test/sequence-graph-2e6fe-dcb8bc6f8-mvk48" Apr 24 22:52:22.351669 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:52:22.351540 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8c55da2-39df-4339-8f89-142de9271828-proxy-tls\") pod \"sequence-graph-2e6fe-dcb8bc6f8-mvk48\" (UID: \"c8c55da2-39df-4339-8f89-142de9271828\") " pod="kserve-ci-e2e-test/sequence-graph-2e6fe-dcb8bc6f8-mvk48" Apr 24 22:52:22.451979 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:52:22.451912 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8c55da2-39df-4339-8f89-142de9271828-proxy-tls\") pod \"sequence-graph-2e6fe-dcb8bc6f8-mvk48\" (UID: 
\"c8c55da2-39df-4339-8f89-142de9271828\") " pod="kserve-ci-e2e-test/sequence-graph-2e6fe-dcb8bc6f8-mvk48" Apr 24 22:52:22.452152 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:52:22.452038 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8c55da2-39df-4339-8f89-142de9271828-openshift-service-ca-bundle\") pod \"sequence-graph-2e6fe-dcb8bc6f8-mvk48\" (UID: \"c8c55da2-39df-4339-8f89-142de9271828\") " pod="kserve-ci-e2e-test/sequence-graph-2e6fe-dcb8bc6f8-mvk48" Apr 24 22:52:22.452152 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:52:22.452061 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-2e6fe-serving-cert: secret "sequence-graph-2e6fe-serving-cert" not found Apr 24 22:52:22.452152 ip-10-0-133-161 kubenswrapper[2574]: E0424 22:52:22.452120 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8c55da2-39df-4339-8f89-142de9271828-proxy-tls podName:c8c55da2-39df-4339-8f89-142de9271828 nodeName:}" failed. No retries permitted until 2026-04-24 22:52:22.952104621 +0000 UTC m=+1352.675697168 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c8c55da2-39df-4339-8f89-142de9271828-proxy-tls") pod "sequence-graph-2e6fe-dcb8bc6f8-mvk48" (UID: "c8c55da2-39df-4339-8f89-142de9271828") : secret "sequence-graph-2e6fe-serving-cert" not found Apr 24 22:52:22.452642 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:52:22.452621 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8c55da2-39df-4339-8f89-142de9271828-openshift-service-ca-bundle\") pod \"sequence-graph-2e6fe-dcb8bc6f8-mvk48\" (UID: \"c8c55da2-39df-4339-8f89-142de9271828\") " pod="kserve-ci-e2e-test/sequence-graph-2e6fe-dcb8bc6f8-mvk48" Apr 24 22:52:22.957634 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:52:22.957597 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8c55da2-39df-4339-8f89-142de9271828-proxy-tls\") pod \"sequence-graph-2e6fe-dcb8bc6f8-mvk48\" (UID: \"c8c55da2-39df-4339-8f89-142de9271828\") " pod="kserve-ci-e2e-test/sequence-graph-2e6fe-dcb8bc6f8-mvk48" Apr 24 22:52:22.960076 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:52:22.960048 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8c55da2-39df-4339-8f89-142de9271828-proxy-tls\") pod \"sequence-graph-2e6fe-dcb8bc6f8-mvk48\" (UID: \"c8c55da2-39df-4339-8f89-142de9271828\") " pod="kserve-ci-e2e-test/sequence-graph-2e6fe-dcb8bc6f8-mvk48" Apr 24 22:52:23.154046 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:52:23.153986 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-2e6fe-dcb8bc6f8-mvk48" Apr 24 22:52:23.274005 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:52:23.273975 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-2e6fe-dcb8bc6f8-mvk48"] Apr 24 22:52:23.277188 ip-10-0-133-161 kubenswrapper[2574]: W0424 22:52:23.277153 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8c55da2_39df_4339_8f89_142de9271828.slice/crio-7d1de485ff63a82d1b752e405383d85bac4253f1be1abbe5c80911e2c59ade51 WatchSource:0}: Error finding container 7d1de485ff63a82d1b752e405383d85bac4253f1be1abbe5c80911e2c59ade51: Status 404 returned error can't find the container with id 7d1de485ff63a82d1b752e405383d85bac4253f1be1abbe5c80911e2c59ade51 Apr 24 22:52:23.955113 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:52:23.955078 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-2e6fe-dcb8bc6f8-mvk48" event={"ID":"c8c55da2-39df-4339-8f89-142de9271828","Type":"ContainerStarted","Data":"f16b0739de2157f59a0602d1231cdd8ec992597df8b200829bfff23305d2d186"} Apr 24 22:52:23.955113 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:52:23.955110 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-2e6fe-dcb8bc6f8-mvk48" event={"ID":"c8c55da2-39df-4339-8f89-142de9271828","Type":"ContainerStarted","Data":"7d1de485ff63a82d1b752e405383d85bac4253f1be1abbe5c80911e2c59ade51"} Apr 24 22:52:23.955357 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:52:23.955135 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-2e6fe-dcb8bc6f8-mvk48" Apr 24 22:52:23.972875 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:52:23.972835 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-2e6fe-dcb8bc6f8-mvk48" 
podStartSLOduration=1.972821766 podStartE2EDuration="1.972821766s" podCreationTimestamp="2026-04-24 22:52:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:52:23.970906144 +0000 UTC m=+1353.694498713" watchObservedRunningTime="2026-04-24 22:52:23.972821766 +0000 UTC m=+1353.696414335" Apr 24 22:52:29.963555 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:52:29.963522 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-2e6fe-dcb8bc6f8-mvk48" Apr 24 22:54:50.866633 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:54:50.866601 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6z9wv_2ae54e6c-a291-4b2e-8885-ba0d08f9048c/console-operator/1.log" Apr 24 22:54:50.868764 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:54:50.868743 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6z9wv_2ae54e6c-a291-4b2e-8885-ba0d08f9048c/console-operator/1.log" Apr 24 22:59:50.887948 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:59:50.887919 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6z9wv_2ae54e6c-a291-4b2e-8885-ba0d08f9048c/console-operator/1.log" Apr 24 22:59:50.890974 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:59:50.890941 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6z9wv_2ae54e6c-a291-4b2e-8885-ba0d08f9048c/console-operator/1.log" Apr 24 22:59:57.096733 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:59:57.096701 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-4aba3-5f6b4f4ff7-vzzq7"] Apr 24 22:59:57.097222 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:59:57.096932 2574 kuberuntime_container.go:864] 
"Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-4aba3-5f6b4f4ff7-vzzq7" podUID="8c8ff629-b2a2-4834-9df1-58b60e87c2e6" containerName="ensemble-graph-4aba3" containerID="cri-o://4afdd2da92626e6c0b8b6b7109091eca04f3a255d801762de14ea79847e7ccf5" gracePeriod=30 Apr 24 22:59:59.843858 ip-10-0-133-161 kubenswrapper[2574]: I0424 22:59:59.843817 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-4aba3-5f6b4f4ff7-vzzq7" podUID="8c8ff629-b2a2-4834-9df1-58b60e87c2e6" containerName="ensemble-graph-4aba3" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 23:00:04.843513 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:04.843475 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-4aba3-5f6b4f4ff7-vzzq7" podUID="8c8ff629-b2a2-4834-9df1-58b60e87c2e6" containerName="ensemble-graph-4aba3" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 23:00:09.843767 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:09.843725 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-4aba3-5f6b4f4ff7-vzzq7" podUID="8c8ff629-b2a2-4834-9df1-58b60e87c2e6" containerName="ensemble-graph-4aba3" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 23:00:09.844237 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:09.843849 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-4aba3-5f6b4f4ff7-vzzq7" Apr 24 23:00:14.843843 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:14.843804 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-4aba3-5f6b4f4ff7-vzzq7" podUID="8c8ff629-b2a2-4834-9df1-58b60e87c2e6" containerName="ensemble-graph-4aba3" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 23:00:19.844285 ip-10-0-133-161 kubenswrapper[2574]: I0424 
23:00:19.844246 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-4aba3-5f6b4f4ff7-vzzq7" podUID="8c8ff629-b2a2-4834-9df1-58b60e87c2e6" containerName="ensemble-graph-4aba3" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 23:00:24.844485 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:24.844440 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-4aba3-5f6b4f4ff7-vzzq7" podUID="8c8ff629-b2a2-4834-9df1-58b60e87c2e6" containerName="ensemble-graph-4aba3" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 23:00:27.235151 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:27.235124 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-4aba3-5f6b4f4ff7-vzzq7" Apr 24 23:00:27.250886 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:27.250860 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8c8ff629-b2a2-4834-9df1-58b60e87c2e6-proxy-tls\") pod \"8c8ff629-b2a2-4834-9df1-58b60e87c2e6\" (UID: \"8c8ff629-b2a2-4834-9df1-58b60e87c2e6\") " Apr 24 23:00:27.251038 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:27.250928 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c8ff629-b2a2-4834-9df1-58b60e87c2e6-openshift-service-ca-bundle\") pod \"8c8ff629-b2a2-4834-9df1-58b60e87c2e6\" (UID: \"8c8ff629-b2a2-4834-9df1-58b60e87c2e6\") " Apr 24 23:00:27.251546 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:27.251519 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c8ff629-b2a2-4834-9df1-58b60e87c2e6-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "8c8ff629-b2a2-4834-9df1-58b60e87c2e6" (UID: 
"8c8ff629-b2a2-4834-9df1-58b60e87c2e6"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:00:27.253356 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:27.253322 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c8ff629-b2a2-4834-9df1-58b60e87c2e6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8c8ff629-b2a2-4834-9df1-58b60e87c2e6" (UID: "8c8ff629-b2a2-4834-9df1-58b60e87c2e6"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:00:27.352071 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:27.351992 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c8ff629-b2a2-4834-9df1-58b60e87c2e6-openshift-service-ca-bundle\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 23:00:27.352071 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:27.352017 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8c8ff629-b2a2-4834-9df1-58b60e87c2e6-proxy-tls\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 23:00:27.372778 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:27.372743 2574 generic.go:358] "Generic (PLEG): container finished" podID="8c8ff629-b2a2-4834-9df1-58b60e87c2e6" containerID="4afdd2da92626e6c0b8b6b7109091eca04f3a255d801762de14ea79847e7ccf5" exitCode=0 Apr 24 23:00:27.372883 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:27.372804 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-4aba3-5f6b4f4ff7-vzzq7" Apr 24 23:00:27.372883 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:27.372813 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-4aba3-5f6b4f4ff7-vzzq7" event={"ID":"8c8ff629-b2a2-4834-9df1-58b60e87c2e6","Type":"ContainerDied","Data":"4afdd2da92626e6c0b8b6b7109091eca04f3a255d801762de14ea79847e7ccf5"} Apr 24 23:00:27.372883 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:27.372840 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-4aba3-5f6b4f4ff7-vzzq7" event={"ID":"8c8ff629-b2a2-4834-9df1-58b60e87c2e6","Type":"ContainerDied","Data":"271c6b17e7421b476c323f81803c137e361e981edfc998df8d3621802442f00d"} Apr 24 23:00:27.372883 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:27.372856 2574 scope.go:117] "RemoveContainer" containerID="4afdd2da92626e6c0b8b6b7109091eca04f3a255d801762de14ea79847e7ccf5" Apr 24 23:00:27.381189 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:27.381165 2574 scope.go:117] "RemoveContainer" containerID="4afdd2da92626e6c0b8b6b7109091eca04f3a255d801762de14ea79847e7ccf5" Apr 24 23:00:27.381480 ip-10-0-133-161 kubenswrapper[2574]: E0424 23:00:27.381461 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4afdd2da92626e6c0b8b6b7109091eca04f3a255d801762de14ea79847e7ccf5\": container with ID starting with 4afdd2da92626e6c0b8b6b7109091eca04f3a255d801762de14ea79847e7ccf5 not found: ID does not exist" containerID="4afdd2da92626e6c0b8b6b7109091eca04f3a255d801762de14ea79847e7ccf5" Apr 24 23:00:27.381533 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:27.381488 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4afdd2da92626e6c0b8b6b7109091eca04f3a255d801762de14ea79847e7ccf5"} err="failed to get container status 
\"4afdd2da92626e6c0b8b6b7109091eca04f3a255d801762de14ea79847e7ccf5\": rpc error: code = NotFound desc = could not find container \"4afdd2da92626e6c0b8b6b7109091eca04f3a255d801762de14ea79847e7ccf5\": container with ID starting with 4afdd2da92626e6c0b8b6b7109091eca04f3a255d801762de14ea79847e7ccf5 not found: ID does not exist" Apr 24 23:00:27.392865 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:27.392843 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-4aba3-5f6b4f4ff7-vzzq7"] Apr 24 23:00:27.396152 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:27.396131 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-4aba3-5f6b4f4ff7-vzzq7"] Apr 24 23:00:28.847150 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:28.847112 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c8ff629-b2a2-4834-9df1-58b60e87c2e6" path="/var/lib/kubelet/pods/8c8ff629-b2a2-4834-9df1-58b60e87c2e6/volumes" Apr 24 23:00:36.981225 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:36.981194 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-2e6fe-dcb8bc6f8-mvk48"] Apr 24 23:00:36.981587 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:36.981429 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-2e6fe-dcb8bc6f8-mvk48" podUID="c8c55da2-39df-4339-8f89-142de9271828" containerName="sequence-graph-2e6fe" containerID="cri-o://f16b0739de2157f59a0602d1231cdd8ec992597df8b200829bfff23305d2d186" gracePeriod=30 Apr 24 23:00:39.962310 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:39.962274 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-2e6fe-dcb8bc6f8-mvk48" podUID="c8c55da2-39df-4339-8f89-142de9271828" containerName="sequence-graph-2e6fe" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 23:00:44.962373 ip-10-0-133-161 
kubenswrapper[2574]: I0424 23:00:44.962328 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-2e6fe-dcb8bc6f8-mvk48" podUID="c8c55da2-39df-4339-8f89-142de9271828" containerName="sequence-graph-2e6fe" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 23:00:49.962876 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:49.962837 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-2e6fe-dcb8bc6f8-mvk48" podUID="c8c55da2-39df-4339-8f89-142de9271828" containerName="sequence-graph-2e6fe" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 23:00:49.963277 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:49.962953 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-2e6fe-dcb8bc6f8-mvk48" Apr 24 23:00:54.963695 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:54.963656 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-2e6fe-dcb8bc6f8-mvk48" podUID="c8c55da2-39df-4339-8f89-142de9271828" containerName="sequence-graph-2e6fe" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 23:00:57.342328 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:57.342296 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-44b1e-98dcdcd84-rr45w"] Apr 24 23:00:57.342671 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:57.342630 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8c8ff629-b2a2-4834-9df1-58b60e87c2e6" containerName="ensemble-graph-4aba3" Apr 24 23:00:57.342671 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:57.342642 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c8ff629-b2a2-4834-9df1-58b60e87c2e6" containerName="ensemble-graph-4aba3" Apr 24 23:00:57.342741 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:57.342705 2574 
memory_manager.go:356] "RemoveStaleState removing state" podUID="8c8ff629-b2a2-4834-9df1-58b60e87c2e6" containerName="ensemble-graph-4aba3" Apr 24 23:00:57.345944 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:57.345929 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-44b1e-98dcdcd84-rr45w" Apr 24 23:00:57.348138 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:57.348103 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-44b1e-kube-rbac-proxy-sar-config\"" Apr 24 23:00:57.348265 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:57.348234 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-44b1e-serving-cert\"" Apr 24 23:00:57.351624 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:57.351600 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-44b1e-98dcdcd84-rr45w"] Apr 24 23:00:57.401696 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:57.401673 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b0c9f0f2-6888-48a6-aec5-6cb581452bd7-proxy-tls\") pod \"splitter-graph-44b1e-98dcdcd84-rr45w\" (UID: \"b0c9f0f2-6888-48a6-aec5-6cb581452bd7\") " pod="kserve-ci-e2e-test/splitter-graph-44b1e-98dcdcd84-rr45w" Apr 24 23:00:57.401813 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:57.401737 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0c9f0f2-6888-48a6-aec5-6cb581452bd7-openshift-service-ca-bundle\") pod \"splitter-graph-44b1e-98dcdcd84-rr45w\" (UID: \"b0c9f0f2-6888-48a6-aec5-6cb581452bd7\") " pod="kserve-ci-e2e-test/splitter-graph-44b1e-98dcdcd84-rr45w" Apr 24 23:00:57.502786 ip-10-0-133-161 kubenswrapper[2574]: 
I0424 23:00:57.502758 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0c9f0f2-6888-48a6-aec5-6cb581452bd7-openshift-service-ca-bundle\") pod \"splitter-graph-44b1e-98dcdcd84-rr45w\" (UID: \"b0c9f0f2-6888-48a6-aec5-6cb581452bd7\") " pod="kserve-ci-e2e-test/splitter-graph-44b1e-98dcdcd84-rr45w" Apr 24 23:00:57.502929 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:57.502806 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b0c9f0f2-6888-48a6-aec5-6cb581452bd7-proxy-tls\") pod \"splitter-graph-44b1e-98dcdcd84-rr45w\" (UID: \"b0c9f0f2-6888-48a6-aec5-6cb581452bd7\") " pod="kserve-ci-e2e-test/splitter-graph-44b1e-98dcdcd84-rr45w" Apr 24 23:00:57.502998 ip-10-0-133-161 kubenswrapper[2574]: E0424 23:00:57.502955 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/splitter-graph-44b1e-serving-cert: secret "splitter-graph-44b1e-serving-cert" not found Apr 24 23:00:57.503070 ip-10-0-133-161 kubenswrapper[2574]: E0424 23:00:57.503058 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0c9f0f2-6888-48a6-aec5-6cb581452bd7-proxy-tls podName:b0c9f0f2-6888-48a6-aec5-6cb581452bd7 nodeName:}" failed. No retries permitted until 2026-04-24 23:00:58.00303537 +0000 UTC m=+1867.726627919 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/b0c9f0f2-6888-48a6-aec5-6cb581452bd7-proxy-tls") pod "splitter-graph-44b1e-98dcdcd84-rr45w" (UID: "b0c9f0f2-6888-48a6-aec5-6cb581452bd7") : secret "splitter-graph-44b1e-serving-cert" not found Apr 24 23:00:57.503518 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:57.503501 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0c9f0f2-6888-48a6-aec5-6cb581452bd7-openshift-service-ca-bundle\") pod \"splitter-graph-44b1e-98dcdcd84-rr45w\" (UID: \"b0c9f0f2-6888-48a6-aec5-6cb581452bd7\") " pod="kserve-ci-e2e-test/splitter-graph-44b1e-98dcdcd84-rr45w" Apr 24 23:00:58.006826 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:58.006784 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b0c9f0f2-6888-48a6-aec5-6cb581452bd7-proxy-tls\") pod \"splitter-graph-44b1e-98dcdcd84-rr45w\" (UID: \"b0c9f0f2-6888-48a6-aec5-6cb581452bd7\") " pod="kserve-ci-e2e-test/splitter-graph-44b1e-98dcdcd84-rr45w" Apr 24 23:00:58.009137 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:58.009108 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b0c9f0f2-6888-48a6-aec5-6cb581452bd7-proxy-tls\") pod \"splitter-graph-44b1e-98dcdcd84-rr45w\" (UID: \"b0c9f0f2-6888-48a6-aec5-6cb581452bd7\") " pod="kserve-ci-e2e-test/splitter-graph-44b1e-98dcdcd84-rr45w" Apr 24 23:00:58.256376 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:58.256340 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-44b1e-98dcdcd84-rr45w"
Apr 24 23:00:58.380476 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:58.380436 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-44b1e-98dcdcd84-rr45w"]
Apr 24 23:00:58.383600 ip-10-0-133-161 kubenswrapper[2574]: W0424 23:00:58.383574 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0c9f0f2_6888_48a6_aec5_6cb581452bd7.slice/crio-a779fbe5992e00f5b71ea7f0e448b78256ff2321921f687167c5b8252b0ec3d8 WatchSource:0}: Error finding container a779fbe5992e00f5b71ea7f0e448b78256ff2321921f687167c5b8252b0ec3d8: Status 404 returned error can't find the container with id a779fbe5992e00f5b71ea7f0e448b78256ff2321921f687167c5b8252b0ec3d8
Apr 24 23:00:58.385429 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:58.385409 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 23:00:58.466282 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:58.466250 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-44b1e-98dcdcd84-rr45w" event={"ID":"b0c9f0f2-6888-48a6-aec5-6cb581452bd7","Type":"ContainerStarted","Data":"6e4d055e3ee616007fa357a2d1bb3e98cce3c4e92172a9b615302ed4ee783312"}
Apr 24 23:00:58.466382 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:58.466289 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-44b1e-98dcdcd84-rr45w" event={"ID":"b0c9f0f2-6888-48a6-aec5-6cb581452bd7","Type":"ContainerStarted","Data":"a779fbe5992e00f5b71ea7f0e448b78256ff2321921f687167c5b8252b0ec3d8"}
Apr 24 23:00:58.466382 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:58.466322 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-44b1e-98dcdcd84-rr45w"
Apr 24 23:00:58.481833 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:58.481794 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-44b1e-98dcdcd84-rr45w" podStartSLOduration=1.481781519 podStartE2EDuration="1.481781519s" podCreationTimestamp="2026-04-24 23:00:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:00:58.48107329 +0000 UTC m=+1868.204665873" watchObservedRunningTime="2026-04-24 23:00:58.481781519 +0000 UTC m=+1868.205374088"
Apr 24 23:00:59.962115 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:00:59.962068 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-2e6fe-dcb8bc6f8-mvk48" podUID="c8c55da2-39df-4339-8f89-142de9271828" containerName="sequence-graph-2e6fe" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 23:01:04.475386 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:04.475309 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-44b1e-98dcdcd84-rr45w"
Apr 24 23:01:04.962525 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:04.962479 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-2e6fe-dcb8bc6f8-mvk48" podUID="c8c55da2-39df-4339-8f89-142de9271828" containerName="sequence-graph-2e6fe" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 23:01:07.121384 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:07.121361 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-2e6fe-dcb8bc6f8-mvk48"
Apr 24 23:01:07.290440 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:07.290347 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8c55da2-39df-4339-8f89-142de9271828-openshift-service-ca-bundle\") pod \"c8c55da2-39df-4339-8f89-142de9271828\" (UID: \"c8c55da2-39df-4339-8f89-142de9271828\") "
Apr 24 23:01:07.290440 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:07.290437 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8c55da2-39df-4339-8f89-142de9271828-proxy-tls\") pod \"c8c55da2-39df-4339-8f89-142de9271828\" (UID: \"c8c55da2-39df-4339-8f89-142de9271828\") "
Apr 24 23:01:07.290714 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:07.290691 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8c55da2-39df-4339-8f89-142de9271828-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "c8c55da2-39df-4339-8f89-142de9271828" (UID: "c8c55da2-39df-4339-8f89-142de9271828"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:01:07.292526 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:07.292506 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8c55da2-39df-4339-8f89-142de9271828-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c8c55da2-39df-4339-8f89-142de9271828" (UID: "c8c55da2-39df-4339-8f89-142de9271828"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:01:07.391469 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:07.391436 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8c55da2-39df-4339-8f89-142de9271828-openshift-service-ca-bundle\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\""
Apr 24 23:01:07.391469 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:07.391463 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8c55da2-39df-4339-8f89-142de9271828-proxy-tls\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\""
Apr 24 23:01:07.431763 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:07.431732 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-44b1e-98dcdcd84-rr45w"]
Apr 24 23:01:07.431953 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:07.431932 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-44b1e-98dcdcd84-rr45w" podUID="b0c9f0f2-6888-48a6-aec5-6cb581452bd7" containerName="splitter-graph-44b1e" containerID="cri-o://6e4d055e3ee616007fa357a2d1bb3e98cce3c4e92172a9b615302ed4ee783312" gracePeriod=30
Apr 24 23:01:07.495090 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:07.495053 2574 generic.go:358] "Generic (PLEG): container finished" podID="c8c55da2-39df-4339-8f89-142de9271828" containerID="f16b0739de2157f59a0602d1231cdd8ec992597df8b200829bfff23305d2d186" exitCode=0
Apr 24 23:01:07.495253 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:07.495112 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-2e6fe-dcb8bc6f8-mvk48"
Apr 24 23:01:07.495253 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:07.495145 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-2e6fe-dcb8bc6f8-mvk48" event={"ID":"c8c55da2-39df-4339-8f89-142de9271828","Type":"ContainerDied","Data":"f16b0739de2157f59a0602d1231cdd8ec992597df8b200829bfff23305d2d186"}
Apr 24 23:01:07.495253 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:07.495185 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-2e6fe-dcb8bc6f8-mvk48" event={"ID":"c8c55da2-39df-4339-8f89-142de9271828","Type":"ContainerDied","Data":"7d1de485ff63a82d1b752e405383d85bac4253f1be1abbe5c80911e2c59ade51"}
Apr 24 23:01:07.495253 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:07.495205 2574 scope.go:117] "RemoveContainer" containerID="f16b0739de2157f59a0602d1231cdd8ec992597df8b200829bfff23305d2d186"
Apr 24 23:01:07.504673 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:07.504654 2574 scope.go:117] "RemoveContainer" containerID="f16b0739de2157f59a0602d1231cdd8ec992597df8b200829bfff23305d2d186"
Apr 24 23:01:07.504940 ip-10-0-133-161 kubenswrapper[2574]: E0424 23:01:07.504919 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f16b0739de2157f59a0602d1231cdd8ec992597df8b200829bfff23305d2d186\": container with ID starting with f16b0739de2157f59a0602d1231cdd8ec992597df8b200829bfff23305d2d186 not found: ID does not exist" containerID="f16b0739de2157f59a0602d1231cdd8ec992597df8b200829bfff23305d2d186"
Apr 24 23:01:07.505034 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:07.504947 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f16b0739de2157f59a0602d1231cdd8ec992597df8b200829bfff23305d2d186"} err="failed to get container status \"f16b0739de2157f59a0602d1231cdd8ec992597df8b200829bfff23305d2d186\": rpc error: code = NotFound desc = could not find container \"f16b0739de2157f59a0602d1231cdd8ec992597df8b200829bfff23305d2d186\": container with ID starting with f16b0739de2157f59a0602d1231cdd8ec992597df8b200829bfff23305d2d186 not found: ID does not exist"
Apr 24 23:01:07.524543 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:07.524511 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-2e6fe-dcb8bc6f8-mvk48"]
Apr 24 23:01:07.525855 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:07.525829 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-2e6fe-dcb8bc6f8-mvk48"]
Apr 24 23:01:08.846237 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:08.846203 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8c55da2-39df-4339-8f89-142de9271828" path="/var/lib/kubelet/pods/c8c55da2-39df-4339-8f89-142de9271828/volumes"
Apr 24 23:01:09.473385 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:09.473344 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-44b1e-98dcdcd84-rr45w" podUID="b0c9f0f2-6888-48a6-aec5-6cb581452bd7" containerName="splitter-graph-44b1e" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 23:01:14.473376 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:14.473337 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-44b1e-98dcdcd84-rr45w" podUID="b0c9f0f2-6888-48a6-aec5-6cb581452bd7" containerName="splitter-graph-44b1e" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 23:01:19.473455 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:19.473411 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-44b1e-98dcdcd84-rr45w" podUID="b0c9f0f2-6888-48a6-aec5-6cb581452bd7" containerName="splitter-graph-44b1e" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 23:01:19.473885 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:19.473510 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-44b1e-98dcdcd84-rr45w"
Apr 24 23:01:24.473577 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:24.473537 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-44b1e-98dcdcd84-rr45w" podUID="b0c9f0f2-6888-48a6-aec5-6cb581452bd7" containerName="splitter-graph-44b1e" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 23:01:29.474134 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:29.474096 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-44b1e-98dcdcd84-rr45w" podUID="b0c9f0f2-6888-48a6-aec5-6cb581452bd7" containerName="splitter-graph-44b1e" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 23:01:34.474120 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:34.474081 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-44b1e-98dcdcd84-rr45w" podUID="b0c9f0f2-6888-48a6-aec5-6cb581452bd7" containerName="splitter-graph-44b1e" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 23:01:37.193384 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:37.193350 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-d0532-75644c6c8f-ccgwj"]
Apr 24 23:01:37.193731 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:37.193665 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c8c55da2-39df-4339-8f89-142de9271828" containerName="sequence-graph-2e6fe"
Apr 24 23:01:37.193731 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:37.193677 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8c55da2-39df-4339-8f89-142de9271828" containerName="sequence-graph-2e6fe"
Apr 24 23:01:37.193801 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:37.193742 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="c8c55da2-39df-4339-8f89-142de9271828" containerName="sequence-graph-2e6fe"
Apr 24 23:01:37.196781 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:37.196765 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-d0532-75644c6c8f-ccgwj"
Apr 24 23:01:37.199533 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:37.199513 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-d0532-serving-cert\""
Apr 24 23:01:37.199533 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:37.199543 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-d0532-kube-rbac-proxy-sar-config\""
Apr 24 23:01:37.203388 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:37.203366 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-d0532-75644c6c8f-ccgwj"]
Apr 24 23:01:37.231592 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:37.231565 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88cfa972-2a64-4953-8cee-2b55422bc533-openshift-service-ca-bundle\") pod \"switch-graph-d0532-75644c6c8f-ccgwj\" (UID: \"88cfa972-2a64-4953-8cee-2b55422bc533\") " pod="kserve-ci-e2e-test/switch-graph-d0532-75644c6c8f-ccgwj"
Apr 24 23:01:37.231697 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:37.231614 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/88cfa972-2a64-4953-8cee-2b55422bc533-proxy-tls\") pod \"switch-graph-d0532-75644c6c8f-ccgwj\" (UID: \"88cfa972-2a64-4953-8cee-2b55422bc533\") " pod="kserve-ci-e2e-test/switch-graph-d0532-75644c6c8f-ccgwj"
Apr 24 23:01:37.332856 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:37.332826 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88cfa972-2a64-4953-8cee-2b55422bc533-openshift-service-ca-bundle\") pod \"switch-graph-d0532-75644c6c8f-ccgwj\" (UID: \"88cfa972-2a64-4953-8cee-2b55422bc533\") " pod="kserve-ci-e2e-test/switch-graph-d0532-75644c6c8f-ccgwj"
Apr 24 23:01:37.333010 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:37.332889 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/88cfa972-2a64-4953-8cee-2b55422bc533-proxy-tls\") pod \"switch-graph-d0532-75644c6c8f-ccgwj\" (UID: \"88cfa972-2a64-4953-8cee-2b55422bc533\") " pod="kserve-ci-e2e-test/switch-graph-d0532-75644c6c8f-ccgwj"
Apr 24 23:01:37.333010 ip-10-0-133-161 kubenswrapper[2574]: E0424 23:01:37.333000 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-d0532-serving-cert: secret "switch-graph-d0532-serving-cert" not found
Apr 24 23:01:37.333083 ip-10-0-133-161 kubenswrapper[2574]: E0424 23:01:37.333064 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88cfa972-2a64-4953-8cee-2b55422bc533-proxy-tls podName:88cfa972-2a64-4953-8cee-2b55422bc533 nodeName:}" failed. No retries permitted until 2026-04-24 23:01:37.833048651 +0000 UTC m=+1907.556641203 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/88cfa972-2a64-4953-8cee-2b55422bc533-proxy-tls") pod "switch-graph-d0532-75644c6c8f-ccgwj" (UID: "88cfa972-2a64-4953-8cee-2b55422bc533") : secret "switch-graph-d0532-serving-cert" not found
Apr 24 23:01:37.333493 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:37.333473 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88cfa972-2a64-4953-8cee-2b55422bc533-openshift-service-ca-bundle\") pod \"switch-graph-d0532-75644c6c8f-ccgwj\" (UID: \"88cfa972-2a64-4953-8cee-2b55422bc533\") " pod="kserve-ci-e2e-test/switch-graph-d0532-75644c6c8f-ccgwj"
Apr 24 23:01:37.577366 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:37.577345 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-44b1e-98dcdcd84-rr45w"
Apr 24 23:01:37.589230 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:37.589206 2574 generic.go:358] "Generic (PLEG): container finished" podID="b0c9f0f2-6888-48a6-aec5-6cb581452bd7" containerID="6e4d055e3ee616007fa357a2d1bb3e98cce3c4e92172a9b615302ed4ee783312" exitCode=0
Apr 24 23:01:37.589330 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:37.589260 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-44b1e-98dcdcd84-rr45w"
Apr 24 23:01:37.589330 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:37.589288 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-44b1e-98dcdcd84-rr45w" event={"ID":"b0c9f0f2-6888-48a6-aec5-6cb581452bd7","Type":"ContainerDied","Data":"6e4d055e3ee616007fa357a2d1bb3e98cce3c4e92172a9b615302ed4ee783312"}
Apr 24 23:01:37.589330 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:37.589319 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-44b1e-98dcdcd84-rr45w" event={"ID":"b0c9f0f2-6888-48a6-aec5-6cb581452bd7","Type":"ContainerDied","Data":"a779fbe5992e00f5b71ea7f0e448b78256ff2321921f687167c5b8252b0ec3d8"}
Apr 24 23:01:37.589442 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:37.589335 2574 scope.go:117] "RemoveContainer" containerID="6e4d055e3ee616007fa357a2d1bb3e98cce3c4e92172a9b615302ed4ee783312"
Apr 24 23:01:37.597945 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:37.597920 2574 scope.go:117] "RemoveContainer" containerID="6e4d055e3ee616007fa357a2d1bb3e98cce3c4e92172a9b615302ed4ee783312"
Apr 24 23:01:37.598229 ip-10-0-133-161 kubenswrapper[2574]: E0424 23:01:37.598211 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e4d055e3ee616007fa357a2d1bb3e98cce3c4e92172a9b615302ed4ee783312\": container with ID starting with 6e4d055e3ee616007fa357a2d1bb3e98cce3c4e92172a9b615302ed4ee783312 not found: ID does not exist" containerID="6e4d055e3ee616007fa357a2d1bb3e98cce3c4e92172a9b615302ed4ee783312"
Apr 24 23:01:37.598280 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:37.598239 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e4d055e3ee616007fa357a2d1bb3e98cce3c4e92172a9b615302ed4ee783312"} err="failed to get container status \"6e4d055e3ee616007fa357a2d1bb3e98cce3c4e92172a9b615302ed4ee783312\": rpc error: code = NotFound desc = could not find container \"6e4d055e3ee616007fa357a2d1bb3e98cce3c4e92172a9b615302ed4ee783312\": container with ID starting with 6e4d055e3ee616007fa357a2d1bb3e98cce3c4e92172a9b615302ed4ee783312 not found: ID does not exist"
Apr 24 23:01:37.635692 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:37.635670 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0c9f0f2-6888-48a6-aec5-6cb581452bd7-openshift-service-ca-bundle\") pod \"b0c9f0f2-6888-48a6-aec5-6cb581452bd7\" (UID: \"b0c9f0f2-6888-48a6-aec5-6cb581452bd7\") "
Apr 24 23:01:37.635793 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:37.635729 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b0c9f0f2-6888-48a6-aec5-6cb581452bd7-proxy-tls\") pod \"b0c9f0f2-6888-48a6-aec5-6cb581452bd7\" (UID: \"b0c9f0f2-6888-48a6-aec5-6cb581452bd7\") "
Apr 24 23:01:37.636022 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:37.636001 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0c9f0f2-6888-48a6-aec5-6cb581452bd7-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "b0c9f0f2-6888-48a6-aec5-6cb581452bd7" (UID: "b0c9f0f2-6888-48a6-aec5-6cb581452bd7"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:01:37.637592 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:37.637574 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0c9f0f2-6888-48a6-aec5-6cb581452bd7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b0c9f0f2-6888-48a6-aec5-6cb581452bd7" (UID: "b0c9f0f2-6888-48a6-aec5-6cb581452bd7"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:01:37.736521 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:37.736449 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0c9f0f2-6888-48a6-aec5-6cb581452bd7-openshift-service-ca-bundle\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\""
Apr 24 23:01:37.736521 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:37.736479 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b0c9f0f2-6888-48a6-aec5-6cb581452bd7-proxy-tls\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\""
Apr 24 23:01:37.836990 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:37.836946 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/88cfa972-2a64-4953-8cee-2b55422bc533-proxy-tls\") pod \"switch-graph-d0532-75644c6c8f-ccgwj\" (UID: \"88cfa972-2a64-4953-8cee-2b55422bc533\") " pod="kserve-ci-e2e-test/switch-graph-d0532-75644c6c8f-ccgwj"
Apr 24 23:01:37.839417 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:37.839392 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/88cfa972-2a64-4953-8cee-2b55422bc533-proxy-tls\") pod \"switch-graph-d0532-75644c6c8f-ccgwj\" (UID: \"88cfa972-2a64-4953-8cee-2b55422bc533\") " pod="kserve-ci-e2e-test/switch-graph-d0532-75644c6c8f-ccgwj"
Apr 24 23:01:37.912186 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:37.912152 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-44b1e-98dcdcd84-rr45w"]
Apr 24 23:01:37.917371 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:37.917348 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-44b1e-98dcdcd84-rr45w"]
Apr 24 23:01:38.108147 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:38.108114 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-d0532-75644c6c8f-ccgwj"
Apr 24 23:01:38.221447 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:38.221425 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-d0532-75644c6c8f-ccgwj"]
Apr 24 23:01:38.224014 ip-10-0-133-161 kubenswrapper[2574]: W0424 23:01:38.223982 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88cfa972_2a64_4953_8cee_2b55422bc533.slice/crio-427ec38c522531db2b9b3ae8134bd4328dfa8ad3378b3b83d56b081ebe9d55ba WatchSource:0}: Error finding container 427ec38c522531db2b9b3ae8134bd4328dfa8ad3378b3b83d56b081ebe9d55ba: Status 404 returned error can't find the container with id 427ec38c522531db2b9b3ae8134bd4328dfa8ad3378b3b83d56b081ebe9d55ba
Apr 24 23:01:38.594336 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:38.594247 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-d0532-75644c6c8f-ccgwj" event={"ID":"88cfa972-2a64-4953-8cee-2b55422bc533","Type":"ContainerStarted","Data":"382a7b523c01edfa6d27a7d37c127062f9121e49481687c5152864096fb07d36"}
Apr 24 23:01:38.594336 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:38.594286 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-d0532-75644c6c8f-ccgwj" event={"ID":"88cfa972-2a64-4953-8cee-2b55422bc533","Type":"ContainerStarted","Data":"427ec38c522531db2b9b3ae8134bd4328dfa8ad3378b3b83d56b081ebe9d55ba"}
Apr 24 23:01:38.594521 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:38.594391 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-d0532-75644c6c8f-ccgwj"
Apr 24 23:01:38.611051 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:38.611009 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-d0532-75644c6c8f-ccgwj" podStartSLOduration=1.610997306 podStartE2EDuration="1.610997306s" podCreationTimestamp="2026-04-24 23:01:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:01:38.609711018 +0000 UTC m=+1908.333303601" watchObservedRunningTime="2026-04-24 23:01:38.610997306 +0000 UTC m=+1908.334589875"
Apr 24 23:01:38.846185 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:38.846098 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0c9f0f2-6888-48a6-aec5-6cb581452bd7" path="/var/lib/kubelet/pods/b0c9f0f2-6888-48a6-aec5-6cb581452bd7/volumes"
Apr 24 23:01:44.602943 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:01:44.602912 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-d0532-75644c6c8f-ccgwj"
Apr 24 23:02:07.616108 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:02:07.616073 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-be7a5-64dbc87fdf-ddr2h"]
Apr 24 23:02:07.616577 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:02:07.616536 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0c9f0f2-6888-48a6-aec5-6cb581452bd7" containerName="splitter-graph-44b1e"
Apr 24 23:02:07.616577 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:02:07.616556 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0c9f0f2-6888-48a6-aec5-6cb581452bd7" containerName="splitter-graph-44b1e"
Apr 24 23:02:07.616687 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:02:07.616659 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="b0c9f0f2-6888-48a6-aec5-6cb581452bd7" containerName="splitter-graph-44b1e"
Apr 24 23:02:07.620928 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:02:07.620908 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-be7a5-64dbc87fdf-ddr2h"
Apr 24 23:02:07.623550 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:02:07.623524 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-be7a5-serving-cert\""
Apr 24 23:02:07.623893 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:02:07.623871 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-be7a5-kube-rbac-proxy-sar-config\""
Apr 24 23:02:07.626136 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:02:07.626114 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-be7a5-64dbc87fdf-ddr2h"]
Apr 24 23:02:07.783888 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:02:07.783848 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb88b2f1-c3bf-49c6-9641-c4dff86b9528-openshift-service-ca-bundle\") pod \"splitter-graph-be7a5-64dbc87fdf-ddr2h\" (UID: \"bb88b2f1-c3bf-49c6-9641-c4dff86b9528\") " pod="kserve-ci-e2e-test/splitter-graph-be7a5-64dbc87fdf-ddr2h"
Apr 24 23:02:07.784071 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:02:07.783936 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bb88b2f1-c3bf-49c6-9641-c4dff86b9528-proxy-tls\") pod \"splitter-graph-be7a5-64dbc87fdf-ddr2h\" (UID: \"bb88b2f1-c3bf-49c6-9641-c4dff86b9528\") " pod="kserve-ci-e2e-test/splitter-graph-be7a5-64dbc87fdf-ddr2h"
Apr 24 23:02:07.885046 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:02:07.884946 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bb88b2f1-c3bf-49c6-9641-c4dff86b9528-proxy-tls\") pod \"splitter-graph-be7a5-64dbc87fdf-ddr2h\" (UID: \"bb88b2f1-c3bf-49c6-9641-c4dff86b9528\") " pod="kserve-ci-e2e-test/splitter-graph-be7a5-64dbc87fdf-ddr2h"
Apr 24 23:02:07.885222 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:02:07.885066 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb88b2f1-c3bf-49c6-9641-c4dff86b9528-openshift-service-ca-bundle\") pod \"splitter-graph-be7a5-64dbc87fdf-ddr2h\" (UID: \"bb88b2f1-c3bf-49c6-9641-c4dff86b9528\") " pod="kserve-ci-e2e-test/splitter-graph-be7a5-64dbc87fdf-ddr2h"
Apr 24 23:02:07.885685 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:02:07.885660 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb88b2f1-c3bf-49c6-9641-c4dff86b9528-openshift-service-ca-bundle\") pod \"splitter-graph-be7a5-64dbc87fdf-ddr2h\" (UID: \"bb88b2f1-c3bf-49c6-9641-c4dff86b9528\") " pod="kserve-ci-e2e-test/splitter-graph-be7a5-64dbc87fdf-ddr2h"
Apr 24 23:02:07.887328 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:02:07.887309 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bb88b2f1-c3bf-49c6-9641-c4dff86b9528-proxy-tls\") pod \"splitter-graph-be7a5-64dbc87fdf-ddr2h\" (UID: \"bb88b2f1-c3bf-49c6-9641-c4dff86b9528\") " pod="kserve-ci-e2e-test/splitter-graph-be7a5-64dbc87fdf-ddr2h"
Apr 24 23:02:07.932199 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:02:07.932152 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-be7a5-64dbc87fdf-ddr2h"
Apr 24 23:02:08.049807 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:02:08.049780 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-be7a5-64dbc87fdf-ddr2h"]
Apr 24 23:02:08.051839 ip-10-0-133-161 kubenswrapper[2574]: W0424 23:02:08.051805 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb88b2f1_c3bf_49c6_9641_c4dff86b9528.slice/crio-a778d16c2d0fc88dc975f9d9680e91984e2bf0fc9aee657eb018675ea1845355 WatchSource:0}: Error finding container a778d16c2d0fc88dc975f9d9680e91984e2bf0fc9aee657eb018675ea1845355: Status 404 returned error can't find the container with id a778d16c2d0fc88dc975f9d9680e91984e2bf0fc9aee657eb018675ea1845355
Apr 24 23:02:08.682056 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:02:08.682016 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-be7a5-64dbc87fdf-ddr2h" event={"ID":"bb88b2f1-c3bf-49c6-9641-c4dff86b9528","Type":"ContainerStarted","Data":"c0556dd23347518a9349efb6a2e4cad07f7157cb08c0e9573903081f7deca594"}
Apr 24 23:02:08.682056 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:02:08.682060 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-be7a5-64dbc87fdf-ddr2h" event={"ID":"bb88b2f1-c3bf-49c6-9641-c4dff86b9528","Type":"ContainerStarted","Data":"a778d16c2d0fc88dc975f9d9680e91984e2bf0fc9aee657eb018675ea1845355"}
Apr 24 23:02:08.682472 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:02:08.682196 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-be7a5-64dbc87fdf-ddr2h"
Apr 24 23:02:08.697522 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:02:08.697477 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-be7a5-64dbc87fdf-ddr2h" podStartSLOduration=1.697464449 podStartE2EDuration="1.697464449s" podCreationTimestamp="2026-04-24 23:02:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:02:08.696622262 +0000 UTC m=+1938.420214832" watchObservedRunningTime="2026-04-24 23:02:08.697464449 +0000 UTC m=+1938.421057019"
Apr 24 23:02:14.691307 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:02:14.691275 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-be7a5-64dbc87fdf-ddr2h"
Apr 24 23:04:50.911982 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:04:50.911939 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6z9wv_2ae54e6c-a291-4b2e-8885-ba0d08f9048c/console-operator/1.log"
Apr 24 23:04:50.914390 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:04:50.914040 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6z9wv_2ae54e6c-a291-4b2e-8885-ba0d08f9048c/console-operator/1.log"
Apr 24 23:09:50.932434 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:09:50.932327 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6z9wv_2ae54e6c-a291-4b2e-8885-ba0d08f9048c/console-operator/1.log"
Apr 24 23:09:50.936745 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:09:50.936721 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6z9wv_2ae54e6c-a291-4b2e-8885-ba0d08f9048c/console-operator/1.log"
Apr 24 23:10:22.246299 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:10:22.246225 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-be7a5-64dbc87fdf-ddr2h"]
Apr 24 23:10:22.246736 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:10:22.246450 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-be7a5-64dbc87fdf-ddr2h" podUID="bb88b2f1-c3bf-49c6-9641-c4dff86b9528" containerName="splitter-graph-be7a5" containerID="cri-o://c0556dd23347518a9349efb6a2e4cad07f7157cb08c0e9573903081f7deca594" gracePeriod=30
Apr 24 23:10:24.689983 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:10:24.689934 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-be7a5-64dbc87fdf-ddr2h" podUID="bb88b2f1-c3bf-49c6-9641-c4dff86b9528" containerName="splitter-graph-be7a5" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 23:10:29.689642 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:10:29.689605 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-be7a5-64dbc87fdf-ddr2h" podUID="bb88b2f1-c3bf-49c6-9641-c4dff86b9528" containerName="splitter-graph-be7a5" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 23:10:34.689594 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:10:34.689558 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-be7a5-64dbc87fdf-ddr2h" podUID="bb88b2f1-c3bf-49c6-9641-c4dff86b9528" containerName="splitter-graph-be7a5" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 23:10:34.689940 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:10:34.689677 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-be7a5-64dbc87fdf-ddr2h"
Apr 24 23:10:39.689463 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:10:39.689430 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-be7a5-64dbc87fdf-ddr2h" podUID="bb88b2f1-c3bf-49c6-9641-c4dff86b9528" containerName="splitter-graph-be7a5" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 23:10:44.689877 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:10:44.689837 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-be7a5-64dbc87fdf-ddr2h" podUID="bb88b2f1-c3bf-49c6-9641-c4dff86b9528" containerName="splitter-graph-be7a5" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 23:10:49.689709 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:10:49.689668 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-be7a5-64dbc87fdf-ddr2h" podUID="bb88b2f1-c3bf-49c6-9641-c4dff86b9528" containerName="splitter-graph-be7a5" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 23:10:52.380132 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:10:52.380110 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-be7a5-64dbc87fdf-ddr2h"
Apr 24 23:10:52.529474 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:10:52.529382 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb88b2f1-c3bf-49c6-9641-c4dff86b9528-openshift-service-ca-bundle\") pod \"bb88b2f1-c3bf-49c6-9641-c4dff86b9528\" (UID: \"bb88b2f1-c3bf-49c6-9641-c4dff86b9528\") "
Apr 24 23:10:52.529474 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:10:52.529433 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bb88b2f1-c3bf-49c6-9641-c4dff86b9528-proxy-tls\") pod \"bb88b2f1-c3bf-49c6-9641-c4dff86b9528\" (UID: \"bb88b2f1-c3bf-49c6-9641-c4dff86b9528\") "
Apr 24 23:10:52.529736 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:10:52.529713 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb88b2f1-c3bf-49c6-9641-c4dff86b9528-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "bb88b2f1-c3bf-49c6-9641-c4dff86b9528" (UID: "bb88b2f1-c3bf-49c6-9641-c4dff86b9528"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:10:52.531516 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:10:52.531498 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb88b2f1-c3bf-49c6-9641-c4dff86b9528-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "bb88b2f1-c3bf-49c6-9641-c4dff86b9528" (UID: "bb88b2f1-c3bf-49c6-9641-c4dff86b9528"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:10:52.630670 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:10:52.630613 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb88b2f1-c3bf-49c6-9641-c4dff86b9528-openshift-service-ca-bundle\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\""
Apr 24 23:10:52.630670 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:10:52.630669 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bb88b2f1-c3bf-49c6-9641-c4dff86b9528-proxy-tls\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\""
Apr 24 23:10:53.206596 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:10:53.206560 2574 generic.go:358] "Generic (PLEG): container finished" podID="bb88b2f1-c3bf-49c6-9641-c4dff86b9528" containerID="c0556dd23347518a9349efb6a2e4cad07f7157cb08c0e9573903081f7deca594" exitCode=0
Apr 24 23:10:53.206765 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:10:53.206647 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-be7a5-64dbc87fdf-ddr2h" Apr 24 23:10:53.206765 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:10:53.206652 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-be7a5-64dbc87fdf-ddr2h" event={"ID":"bb88b2f1-c3bf-49c6-9641-c4dff86b9528","Type":"ContainerDied","Data":"c0556dd23347518a9349efb6a2e4cad07f7157cb08c0e9573903081f7deca594"} Apr 24 23:10:53.206765 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:10:53.206687 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-be7a5-64dbc87fdf-ddr2h" event={"ID":"bb88b2f1-c3bf-49c6-9641-c4dff86b9528","Type":"ContainerDied","Data":"a778d16c2d0fc88dc975f9d9680e91984e2bf0fc9aee657eb018675ea1845355"} Apr 24 23:10:53.206765 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:10:53.206704 2574 scope.go:117] "RemoveContainer" containerID="c0556dd23347518a9349efb6a2e4cad07f7157cb08c0e9573903081f7deca594" Apr 24 23:10:53.214548 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:10:53.214536 2574 scope.go:117] "RemoveContainer" containerID="c0556dd23347518a9349efb6a2e4cad07f7157cb08c0e9573903081f7deca594" Apr 24 23:10:53.214775 ip-10-0-133-161 kubenswrapper[2574]: E0424 23:10:53.214758 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0556dd23347518a9349efb6a2e4cad07f7157cb08c0e9573903081f7deca594\": container with ID starting with c0556dd23347518a9349efb6a2e4cad07f7157cb08c0e9573903081f7deca594 not found: ID does not exist" containerID="c0556dd23347518a9349efb6a2e4cad07f7157cb08c0e9573903081f7deca594" Apr 24 23:10:53.214835 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:10:53.214781 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0556dd23347518a9349efb6a2e4cad07f7157cb08c0e9573903081f7deca594"} err="failed to get container status 
\"c0556dd23347518a9349efb6a2e4cad07f7157cb08c0e9573903081f7deca594\": rpc error: code = NotFound desc = could not find container \"c0556dd23347518a9349efb6a2e4cad07f7157cb08c0e9573903081f7deca594\": container with ID starting with c0556dd23347518a9349efb6a2e4cad07f7157cb08c0e9573903081f7deca594 not found: ID does not exist" Apr 24 23:10:53.221173 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:10:53.221154 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-be7a5-64dbc87fdf-ddr2h"] Apr 24 23:10:53.224846 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:10:53.224825 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-be7a5-64dbc87fdf-ddr2h"] Apr 24 23:10:54.846619 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:10:54.846578 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb88b2f1-c3bf-49c6-9641-c4dff86b9528" path="/var/lib/kubelet/pods/bb88b2f1-c3bf-49c6-9641-c4dff86b9528/volumes" Apr 24 23:14:50.953087 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:14:50.952951 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6z9wv_2ae54e6c-a291-4b2e-8885-ba0d08f9048c/console-operator/1.log" Apr 24 23:14:50.958679 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:14:50.958656 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6z9wv_2ae54e6c-a291-4b2e-8885-ba0d08f9048c/console-operator/1.log" Apr 24 23:17:56.680727 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:17:56.680647 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-d0532-75644c6c8f-ccgwj"] Apr 24 23:17:56.681223 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:17:56.680867 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-d0532-75644c6c8f-ccgwj" podUID="88cfa972-2a64-4953-8cee-2b55422bc533" 
containerName="switch-graph-d0532" containerID="cri-o://382a7b523c01edfa6d27a7d37c127062f9121e49481687c5152864096fb07d36" gracePeriod=30 Apr 24 23:17:58.190281 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:17:58.190248 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gvdqg/must-gather-c7ldc"] Apr 24 23:17:58.190660 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:17:58.190592 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb88b2f1-c3bf-49c6-9641-c4dff86b9528" containerName="splitter-graph-be7a5" Apr 24 23:17:58.190660 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:17:58.190604 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb88b2f1-c3bf-49c6-9641-c4dff86b9528" containerName="splitter-graph-be7a5" Apr 24 23:17:58.190660 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:17:58.190658 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="bb88b2f1-c3bf-49c6-9641-c4dff86b9528" containerName="splitter-graph-be7a5" Apr 24 23:17:58.193810 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:17:58.193792 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gvdqg/must-gather-c7ldc" Apr 24 23:17:58.196333 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:17:58.196304 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gvdqg\"/\"openshift-service-ca.crt\"" Apr 24 23:17:58.196447 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:17:58.196345 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-gvdqg\"/\"default-dockercfg-8xzc9\"" Apr 24 23:17:58.196447 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:17:58.196406 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gvdqg\"/\"kube-root-ca.crt\"" Apr 24 23:17:58.207470 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:17:58.207450 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gvdqg/must-gather-c7ldc"] Apr 24 23:17:58.273520 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:17:58.273485 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8jcj\" (UniqueName: \"kubernetes.io/projected/0dd851af-6887-427f-9ac6-48a6b0ded5b5-kube-api-access-m8jcj\") pod \"must-gather-c7ldc\" (UID: \"0dd851af-6887-427f-9ac6-48a6b0ded5b5\") " pod="openshift-must-gather-gvdqg/must-gather-c7ldc" Apr 24 23:17:58.273682 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:17:58.273554 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0dd851af-6887-427f-9ac6-48a6b0ded5b5-must-gather-output\") pod \"must-gather-c7ldc\" (UID: \"0dd851af-6887-427f-9ac6-48a6b0ded5b5\") " pod="openshift-must-gather-gvdqg/must-gather-c7ldc" Apr 24 23:17:58.374956 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:17:58.374916 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m8jcj\" (UniqueName: 
\"kubernetes.io/projected/0dd851af-6887-427f-9ac6-48a6b0ded5b5-kube-api-access-m8jcj\") pod \"must-gather-c7ldc\" (UID: \"0dd851af-6887-427f-9ac6-48a6b0ded5b5\") " pod="openshift-must-gather-gvdqg/must-gather-c7ldc" Apr 24 23:17:58.375141 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:17:58.375014 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0dd851af-6887-427f-9ac6-48a6b0ded5b5-must-gather-output\") pod \"must-gather-c7ldc\" (UID: \"0dd851af-6887-427f-9ac6-48a6b0ded5b5\") " pod="openshift-must-gather-gvdqg/must-gather-c7ldc" Apr 24 23:17:58.375341 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:17:58.375324 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0dd851af-6887-427f-9ac6-48a6b0ded5b5-must-gather-output\") pod \"must-gather-c7ldc\" (UID: \"0dd851af-6887-427f-9ac6-48a6b0ded5b5\") " pod="openshift-must-gather-gvdqg/must-gather-c7ldc" Apr 24 23:17:58.382433 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:17:58.382406 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8jcj\" (UniqueName: \"kubernetes.io/projected/0dd851af-6887-427f-9ac6-48a6b0ded5b5-kube-api-access-m8jcj\") pod \"must-gather-c7ldc\" (UID: \"0dd851af-6887-427f-9ac6-48a6b0ded5b5\") " pod="openshift-must-gather-gvdqg/must-gather-c7ldc" Apr 24 23:17:58.517595 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:17:58.517518 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gvdqg/must-gather-c7ldc" Apr 24 23:17:58.637479 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:17:58.637456 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gvdqg/must-gather-c7ldc"] Apr 24 23:17:58.640068 ip-10-0-133-161 kubenswrapper[2574]: W0424 23:17:58.640041 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0dd851af_6887_427f_9ac6_48a6b0ded5b5.slice/crio-6de141c2e80f358038c7cbab698963640286f26c3a239711ee8aa244dbe342e0 WatchSource:0}: Error finding container 6de141c2e80f358038c7cbab698963640286f26c3a239711ee8aa244dbe342e0: Status 404 returned error can't find the container with id 6de141c2e80f358038c7cbab698963640286f26c3a239711ee8aa244dbe342e0 Apr 24 23:17:58.641807 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:17:58.641791 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 23:17:59.437996 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:17:59.437946 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gvdqg/must-gather-c7ldc" event={"ID":"0dd851af-6887-427f-9ac6-48a6b0ded5b5","Type":"ContainerStarted","Data":"6de141c2e80f358038c7cbab698963640286f26c3a239711ee8aa244dbe342e0"} Apr 24 23:17:59.603467 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:17:59.603423 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-d0532-75644c6c8f-ccgwj" podUID="88cfa972-2a64-4953-8cee-2b55422bc533" containerName="switch-graph-d0532" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 23:18:03.453843 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:03.453805 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gvdqg/must-gather-c7ldc" 
event={"ID":"0dd851af-6887-427f-9ac6-48a6b0ded5b5","Type":"ContainerStarted","Data":"79ee00dcb0edce722df805e548ddce0b078132ee934bb9e2847f492920287a42"} Apr 24 23:18:03.453843 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:03.453847 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gvdqg/must-gather-c7ldc" event={"ID":"0dd851af-6887-427f-9ac6-48a6b0ded5b5","Type":"ContainerStarted","Data":"cb0fab35511a77e12ce08d6dd566417eb2c807ba6ba6ffee04a148b47e1abd0e"} Apr 24 23:18:03.469694 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:03.469640 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gvdqg/must-gather-c7ldc" podStartSLOduration=1.488518204 podStartE2EDuration="5.46962803s" podCreationTimestamp="2026-04-24 23:17:58 +0000 UTC" firstStartedPulling="2026-04-24 23:17:58.64192258 +0000 UTC m=+2888.365515132" lastFinishedPulling="2026-04-24 23:18:02.62303241 +0000 UTC m=+2892.346624958" observedRunningTime="2026-04-24 23:18:03.468944106 +0000 UTC m=+2893.192536712" watchObservedRunningTime="2026-04-24 23:18:03.46962803 +0000 UTC m=+2893.193220599" Apr 24 23:18:04.601557 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:04.601518 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-d0532-75644c6c8f-ccgwj" podUID="88cfa972-2a64-4953-8cee-2b55422bc533" containerName="switch-graph-d0532" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 23:18:09.601940 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:09.601894 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-d0532-75644c6c8f-ccgwj" podUID="88cfa972-2a64-4953-8cee-2b55422bc533" containerName="switch-graph-d0532" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 23:18:09.602590 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:09.602567 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="kserve-ci-e2e-test/switch-graph-d0532-75644c6c8f-ccgwj" Apr 24 23:18:11.434167 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:11.434131 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-d0532-75644c6c8f-ccgwj_88cfa972-2a64-4953-8cee-2b55422bc533/switch-graph-d0532/0.log" Apr 24 23:18:12.255123 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:12.255092 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-d0532-75644c6c8f-ccgwj_88cfa972-2a64-4953-8cee-2b55422bc533/switch-graph-d0532/0.log" Apr 24 23:18:13.043993 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:13.043950 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-d0532-75644c6c8f-ccgwj_88cfa972-2a64-4953-8cee-2b55422bc533/switch-graph-d0532/0.log" Apr 24 23:18:13.812588 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:13.812562 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-d0532-75644c6c8f-ccgwj_88cfa972-2a64-4953-8cee-2b55422bc533/switch-graph-d0532/0.log" Apr 24 23:18:14.593003 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:14.592972 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-d0532-75644c6c8f-ccgwj_88cfa972-2a64-4953-8cee-2b55422bc533/switch-graph-d0532/0.log" Apr 24 23:18:14.601263 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:14.601234 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-d0532-75644c6c8f-ccgwj" podUID="88cfa972-2a64-4953-8cee-2b55422bc533" containerName="switch-graph-d0532" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 23:18:15.378495 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:15.378470 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_switch-graph-d0532-75644c6c8f-ccgwj_88cfa972-2a64-4953-8cee-2b55422bc533/switch-graph-d0532/0.log" Apr 24 23:18:16.155541 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:16.155515 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-d0532-75644c6c8f-ccgwj_88cfa972-2a64-4953-8cee-2b55422bc533/switch-graph-d0532/0.log" Apr 24 23:18:16.961914 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:16.961884 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-d0532-75644c6c8f-ccgwj_88cfa972-2a64-4953-8cee-2b55422bc533/switch-graph-d0532/0.log" Apr 24 23:18:17.753712 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:17.753681 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-d0532-75644c6c8f-ccgwj_88cfa972-2a64-4953-8cee-2b55422bc533/switch-graph-d0532/0.log" Apr 24 23:18:18.563552 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:18.563524 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-d0532-75644c6c8f-ccgwj_88cfa972-2a64-4953-8cee-2b55422bc533/switch-graph-d0532/0.log" Apr 24 23:18:19.362482 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:19.362451 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-d0532-75644c6c8f-ccgwj_88cfa972-2a64-4953-8cee-2b55422bc533/switch-graph-d0532/0.log" Apr 24 23:18:19.601043 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:19.601008 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-d0532-75644c6c8f-ccgwj" podUID="88cfa972-2a64-4953-8cee-2b55422bc533" containerName="switch-graph-d0532" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 23:18:20.181583 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:20.181555 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_switch-graph-d0532-75644c6c8f-ccgwj_88cfa972-2a64-4953-8cee-2b55422bc533/switch-graph-d0532/0.log" Apr 24 23:18:21.512296 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:21.512220 2574 generic.go:358] "Generic (PLEG): container finished" podID="0dd851af-6887-427f-9ac6-48a6b0ded5b5" containerID="cb0fab35511a77e12ce08d6dd566417eb2c807ba6ba6ffee04a148b47e1abd0e" exitCode=0 Apr 24 23:18:21.512296 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:21.512259 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gvdqg/must-gather-c7ldc" event={"ID":"0dd851af-6887-427f-9ac6-48a6b0ded5b5","Type":"ContainerDied","Data":"cb0fab35511a77e12ce08d6dd566417eb2c807ba6ba6ffee04a148b47e1abd0e"} Apr 24 23:18:21.512698 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:21.512576 2574 scope.go:117] "RemoveContainer" containerID="cb0fab35511a77e12ce08d6dd566417eb2c807ba6ba6ffee04a148b47e1abd0e" Apr 24 23:18:22.193581 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:22.193557 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gvdqg_must-gather-c7ldc_0dd851af-6887-427f-9ac6-48a6b0ded5b5/gather/0.log" Apr 24 23:18:22.792291 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:22.792258 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jpd6k/must-gather-jr66p"] Apr 24 23:18:22.795738 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:22.795723 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jpd6k/must-gather-jr66p" Apr 24 23:18:22.798277 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:22.798257 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jpd6k\"/\"openshift-service-ca.crt\"" Apr 24 23:18:22.798401 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:22.798297 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jpd6k\"/\"kube-root-ca.crt\"" Apr 24 23:18:22.799270 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:22.799252 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-jpd6k\"/\"default-dockercfg-fs845\"" Apr 24 23:18:22.803004 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:22.802981 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jpd6k/must-gather-jr66p"] Apr 24 23:18:22.887189 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:22.887149 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/47ae230d-a577-4729-a86c-1aa4abd5f431-must-gather-output\") pod \"must-gather-jr66p\" (UID: \"47ae230d-a577-4729-a86c-1aa4abd5f431\") " pod="openshift-must-gather-jpd6k/must-gather-jr66p" Apr 24 23:18:22.887394 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:22.887256 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vtzg\" (UniqueName: \"kubernetes.io/projected/47ae230d-a577-4729-a86c-1aa4abd5f431-kube-api-access-7vtzg\") pod \"must-gather-jr66p\" (UID: \"47ae230d-a577-4729-a86c-1aa4abd5f431\") " pod="openshift-must-gather-jpd6k/must-gather-jr66p" Apr 24 23:18:22.987950 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:22.987921 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/47ae230d-a577-4729-a86c-1aa4abd5f431-must-gather-output\") pod \"must-gather-jr66p\" (UID: \"47ae230d-a577-4729-a86c-1aa4abd5f431\") " pod="openshift-must-gather-jpd6k/must-gather-jr66p" Apr 24 23:18:22.988091 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:22.987989 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7vtzg\" (UniqueName: \"kubernetes.io/projected/47ae230d-a577-4729-a86c-1aa4abd5f431-kube-api-access-7vtzg\") pod \"must-gather-jr66p\" (UID: \"47ae230d-a577-4729-a86c-1aa4abd5f431\") " pod="openshift-must-gather-jpd6k/must-gather-jr66p" Apr 24 23:18:22.988297 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:22.988274 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/47ae230d-a577-4729-a86c-1aa4abd5f431-must-gather-output\") pod \"must-gather-jr66p\" (UID: \"47ae230d-a577-4729-a86c-1aa4abd5f431\") " pod="openshift-must-gather-jpd6k/must-gather-jr66p" Apr 24 23:18:22.996104 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:22.996084 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vtzg\" (UniqueName: \"kubernetes.io/projected/47ae230d-a577-4729-a86c-1aa4abd5f431-kube-api-access-7vtzg\") pod \"must-gather-jr66p\" (UID: \"47ae230d-a577-4729-a86c-1aa4abd5f431\") " pod="openshift-must-gather-jpd6k/must-gather-jr66p" Apr 24 23:18:23.105071 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:23.105043 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jpd6k/must-gather-jr66p" Apr 24 23:18:23.221356 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:23.221201 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jpd6k/must-gather-jr66p"] Apr 24 23:18:23.223922 ip-10-0-133-161 kubenswrapper[2574]: W0424 23:18:23.223893 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47ae230d_a577_4729_a86c_1aa4abd5f431.slice/crio-ef32c9d6cc502874c580da2298dbf570c0b5782424c4727e3644f60b5ccfe4ac WatchSource:0}: Error finding container ef32c9d6cc502874c580da2298dbf570c0b5782424c4727e3644f60b5ccfe4ac: Status 404 returned error can't find the container with id ef32c9d6cc502874c580da2298dbf570c0b5782424c4727e3644f60b5ccfe4ac Apr 24 23:18:23.520164 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:23.520072 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jpd6k/must-gather-jr66p" event={"ID":"47ae230d-a577-4729-a86c-1aa4abd5f431","Type":"ContainerStarted","Data":"ef32c9d6cc502874c580da2298dbf570c0b5782424c4727e3644f60b5ccfe4ac"} Apr 24 23:18:24.524979 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:24.524930 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jpd6k/must-gather-jr66p" event={"ID":"47ae230d-a577-4729-a86c-1aa4abd5f431","Type":"ContainerStarted","Data":"61f1e86d1e460e60d1170b4942c29a5332f1062dbb1bea7af8e6d3363999724a"} Apr 24 23:18:24.525359 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:24.524989 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jpd6k/must-gather-jr66p" event={"ID":"47ae230d-a577-4729-a86c-1aa4abd5f431","Type":"ContainerStarted","Data":"0743b449a22773deb056ed2147800a4de2983b97a6b65311606d7dafa6ed9979"} Apr 24 23:18:24.541447 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:24.541395 2574 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-must-gather-jpd6k/must-gather-jr66p" podStartSLOduration=1.695215717 podStartE2EDuration="2.541381154s" podCreationTimestamp="2026-04-24 23:18:22 +0000 UTC" firstStartedPulling="2026-04-24 23:18:23.225522466 +0000 UTC m=+2912.949115014" lastFinishedPulling="2026-04-24 23:18:24.071687903 +0000 UTC m=+2913.795280451" observedRunningTime="2026-04-24 23:18:24.539077844 +0000 UTC m=+2914.262670414" watchObservedRunningTime="2026-04-24 23:18:24.541381154 +0000 UTC m=+2914.264973766" Apr 24 23:18:24.601357 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:24.601283 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-d0532-75644c6c8f-ccgwj" podUID="88cfa972-2a64-4953-8cee-2b55422bc533" containerName="switch-graph-d0532" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 23:18:25.480881 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:25.480853 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-crl79_53fd54b3-2948-44d4-9d51-6c630d4b0a08/global-pull-secret-syncer/0.log" Apr 24 23:18:25.566294 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:25.566261 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-l24ds_23caaf68-bf31-4bd4-8417-c00b22900ee2/konnectivity-agent/0.log" Apr 24 23:18:25.675046 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:25.675013 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-161.ec2.internal_71c8bd7a095056144ad8091ca3c68103/haproxy/0.log" Apr 24 23:18:27.457443 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:27.457036 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-d0532-75644c6c8f-ccgwj" Apr 24 23:18:27.535389 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:27.535277 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/88cfa972-2a64-4953-8cee-2b55422bc533-proxy-tls\") pod \"88cfa972-2a64-4953-8cee-2b55422bc533\" (UID: \"88cfa972-2a64-4953-8cee-2b55422bc533\") " Apr 24 23:18:27.535389 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:27.535350 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88cfa972-2a64-4953-8cee-2b55422bc533-openshift-service-ca-bundle\") pod \"88cfa972-2a64-4953-8cee-2b55422bc533\" (UID: \"88cfa972-2a64-4953-8cee-2b55422bc533\") " Apr 24 23:18:27.536484 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:27.536385 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88cfa972-2a64-4953-8cee-2b55422bc533-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "88cfa972-2a64-4953-8cee-2b55422bc533" (UID: "88cfa972-2a64-4953-8cee-2b55422bc533"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:18:27.543391 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:27.542756 2574 generic.go:358] "Generic (PLEG): container finished" podID="88cfa972-2a64-4953-8cee-2b55422bc533" containerID="382a7b523c01edfa6d27a7d37c127062f9121e49481687c5152864096fb07d36" exitCode=0 Apr 24 23:18:27.543391 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:27.542873 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-d0532-75644c6c8f-ccgwj" event={"ID":"88cfa972-2a64-4953-8cee-2b55422bc533","Type":"ContainerDied","Data":"382a7b523c01edfa6d27a7d37c127062f9121e49481687c5152864096fb07d36"} Apr 24 23:18:27.543391 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:27.542903 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-d0532-75644c6c8f-ccgwj" event={"ID":"88cfa972-2a64-4953-8cee-2b55422bc533","Type":"ContainerDied","Data":"427ec38c522531db2b9b3ae8134bd4328dfa8ad3378b3b83d56b081ebe9d55ba"} Apr 24 23:18:27.543391 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:27.542925 2574 scope.go:117] "RemoveContainer" containerID="382a7b523c01edfa6d27a7d37c127062f9121e49481687c5152864096fb07d36" Apr 24 23:18:27.543391 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:27.543096 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-d0532-75644c6c8f-ccgwj" Apr 24 23:18:27.553338 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:27.553269 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88cfa972-2a64-4953-8cee-2b55422bc533-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "88cfa972-2a64-4953-8cee-2b55422bc533" (UID: "88cfa972-2a64-4953-8cee-2b55422bc533"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:18:27.566711 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:27.566644 2574 scope.go:117] "RemoveContainer" containerID="382a7b523c01edfa6d27a7d37c127062f9121e49481687c5152864096fb07d36" Apr 24 23:18:27.567232 ip-10-0-133-161 kubenswrapper[2574]: E0424 23:18:27.567152 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"382a7b523c01edfa6d27a7d37c127062f9121e49481687c5152864096fb07d36\": container with ID starting with 382a7b523c01edfa6d27a7d37c127062f9121e49481687c5152864096fb07d36 not found: ID does not exist" containerID="382a7b523c01edfa6d27a7d37c127062f9121e49481687c5152864096fb07d36" Apr 24 23:18:27.567232 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:27.567187 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"382a7b523c01edfa6d27a7d37c127062f9121e49481687c5152864096fb07d36"} err="failed to get container status \"382a7b523c01edfa6d27a7d37c127062f9121e49481687c5152864096fb07d36\": rpc error: code = NotFound desc = could not find container \"382a7b523c01edfa6d27a7d37c127062f9121e49481687c5152864096fb07d36\": container with ID starting with 382a7b523c01edfa6d27a7d37c127062f9121e49481687c5152864096fb07d36 not found: ID does not exist" Apr 24 23:18:27.620903 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:27.619593 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-gvdqg/must-gather-c7ldc"] Apr 24 23:18:27.620903 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:27.619878 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-gvdqg/must-gather-c7ldc" podUID="0dd851af-6887-427f-9ac6-48a6b0ded5b5" containerName="copy" containerID="cri-o://79ee00dcb0edce722df805e548ddce0b078132ee934bb9e2847f492920287a42" gracePeriod=2 Apr 24 23:18:27.626934 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:27.626747 
2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-gvdqg/must-gather-c7ldc"] Apr 24 23:18:27.626934 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:27.626864 2574 status_manager.go:895] "Failed to get status for pod" podUID="0dd851af-6887-427f-9ac6-48a6b0ded5b5" pod="openshift-must-gather-gvdqg/must-gather-c7ldc" err="pods \"must-gather-c7ldc\" is forbidden: User \"system:node:ip-10-0-133-161.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-gvdqg\": no relationship found between node 'ip-10-0-133-161.ec2.internal' and this object" Apr 24 23:18:27.641886 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:27.641821 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88cfa972-2a64-4953-8cee-2b55422bc533-openshift-service-ca-bundle\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 23:18:27.641886 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:27.641858 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/88cfa972-2a64-4953-8cee-2b55422bc533-proxy-tls\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 23:18:27.860071 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:27.859340 2574 status_manager.go:895] "Failed to get status for pod" podUID="0dd851af-6887-427f-9ac6-48a6b0ded5b5" pod="openshift-must-gather-gvdqg/must-gather-c7ldc" err="pods \"must-gather-c7ldc\" is forbidden: User \"system:node:ip-10-0-133-161.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-gvdqg\": no relationship found between node 'ip-10-0-133-161.ec2.internal' and this object" Apr 24 23:18:27.874983 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:27.871754 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-d0532-75644c6c8f-ccgwj"] Apr 24 23:18:27.878360 
ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:27.878305 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-d0532-75644c6c8f-ccgwj"] Apr 24 23:18:27.969506 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:27.968704 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gvdqg_must-gather-c7ldc_0dd851af-6887-427f-9ac6-48a6b0ded5b5/copy/0.log" Apr 24 23:18:27.969506 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:27.969165 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gvdqg/must-gather-c7ldc" Apr 24 23:18:27.971229 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:27.971186 2574 status_manager.go:895] "Failed to get status for pod" podUID="0dd851af-6887-427f-9ac6-48a6b0ded5b5" pod="openshift-must-gather-gvdqg/must-gather-c7ldc" err="pods \"must-gather-c7ldc\" is forbidden: User \"system:node:ip-10-0-133-161.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-gvdqg\": no relationship found between node 'ip-10-0-133-161.ec2.internal' and this object" Apr 24 23:18:28.050406 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:28.049710 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0dd851af-6887-427f-9ac6-48a6b0ded5b5-must-gather-output\") pod \"0dd851af-6887-427f-9ac6-48a6b0ded5b5\" (UID: \"0dd851af-6887-427f-9ac6-48a6b0ded5b5\") " Apr 24 23:18:28.050406 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:28.049800 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8jcj\" (UniqueName: \"kubernetes.io/projected/0dd851af-6887-427f-9ac6-48a6b0ded5b5-kube-api-access-m8jcj\") pod \"0dd851af-6887-427f-9ac6-48a6b0ded5b5\" (UID: \"0dd851af-6887-427f-9ac6-48a6b0ded5b5\") " Apr 24 23:18:28.052901 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:28.052851 
2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dd851af-6887-427f-9ac6-48a6b0ded5b5-kube-api-access-m8jcj" (OuterVolumeSpecName: "kube-api-access-m8jcj") pod "0dd851af-6887-427f-9ac6-48a6b0ded5b5" (UID: "0dd851af-6887-427f-9ac6-48a6b0ded5b5"). InnerVolumeSpecName "kube-api-access-m8jcj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:18:28.062025 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:28.061942 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dd851af-6887-427f-9ac6-48a6b0ded5b5-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "0dd851af-6887-427f-9ac6-48a6b0ded5b5" (UID: "0dd851af-6887-427f-9ac6-48a6b0ded5b5"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 23:18:28.151403 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:28.151326 2574 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0dd851af-6887-427f-9ac6-48a6b0ded5b5-must-gather-output\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 23:18:28.151403 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:28.151365 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m8jcj\" (UniqueName: \"kubernetes.io/projected/0dd851af-6887-427f-9ac6-48a6b0ded5b5-kube-api-access-m8jcj\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 24 23:18:28.549336 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:28.549275 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gvdqg_must-gather-c7ldc_0dd851af-6887-427f-9ac6-48a6b0ded5b5/copy/0.log" Apr 24 23:18:28.550530 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:28.550483 2574 generic.go:358] "Generic (PLEG): container finished" podID="0dd851af-6887-427f-9ac6-48a6b0ded5b5" 
containerID="79ee00dcb0edce722df805e548ddce0b078132ee934bb9e2847f492920287a42" exitCode=143 Apr 24 23:18:28.550884 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:28.550753 2574 scope.go:117] "RemoveContainer" containerID="79ee00dcb0edce722df805e548ddce0b078132ee934bb9e2847f492920287a42" Apr 24 23:18:28.551267 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:28.551006 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gvdqg/must-gather-c7ldc" Apr 24 23:18:28.555491 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:28.555450 2574 status_manager.go:895] "Failed to get status for pod" podUID="0dd851af-6887-427f-9ac6-48a6b0ded5b5" pod="openshift-must-gather-gvdqg/must-gather-c7ldc" err="pods \"must-gather-c7ldc\" is forbidden: User \"system:node:ip-10-0-133-161.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-gvdqg\": no relationship found between node 'ip-10-0-133-161.ec2.internal' and this object" Apr 24 23:18:28.570471 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:28.569120 2574 scope.go:117] "RemoveContainer" containerID="cb0fab35511a77e12ce08d6dd566417eb2c807ba6ba6ffee04a148b47e1abd0e" Apr 24 23:18:28.604440 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:28.603425 2574 scope.go:117] "RemoveContainer" containerID="79ee00dcb0edce722df805e548ddce0b078132ee934bb9e2847f492920287a42" Apr 24 23:18:28.607002 ip-10-0-133-161 kubenswrapper[2574]: E0424 23:18:28.606764 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79ee00dcb0edce722df805e548ddce0b078132ee934bb9e2847f492920287a42\": container with ID starting with 79ee00dcb0edce722df805e548ddce0b078132ee934bb9e2847f492920287a42 not found: ID does not exist" containerID="79ee00dcb0edce722df805e548ddce0b078132ee934bb9e2847f492920287a42" Apr 24 23:18:28.607002 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:28.606821 2574 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79ee00dcb0edce722df805e548ddce0b078132ee934bb9e2847f492920287a42"} err="failed to get container status \"79ee00dcb0edce722df805e548ddce0b078132ee934bb9e2847f492920287a42\": rpc error: code = NotFound desc = could not find container \"79ee00dcb0edce722df805e548ddce0b078132ee934bb9e2847f492920287a42\": container with ID starting with 79ee00dcb0edce722df805e548ddce0b078132ee934bb9e2847f492920287a42 not found: ID does not exist" Apr 24 23:18:28.607002 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:28.606850 2574 scope.go:117] "RemoveContainer" containerID="cb0fab35511a77e12ce08d6dd566417eb2c807ba6ba6ffee04a148b47e1abd0e" Apr 24 23:18:28.610916 ip-10-0-133-161 kubenswrapper[2574]: E0424 23:18:28.610893 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb0fab35511a77e12ce08d6dd566417eb2c807ba6ba6ffee04a148b47e1abd0e\": container with ID starting with cb0fab35511a77e12ce08d6dd566417eb2c807ba6ba6ffee04a148b47e1abd0e not found: ID does not exist" containerID="cb0fab35511a77e12ce08d6dd566417eb2c807ba6ba6ffee04a148b47e1abd0e" Apr 24 23:18:28.611127 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:28.611103 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb0fab35511a77e12ce08d6dd566417eb2c807ba6ba6ffee04a148b47e1abd0e"} err="failed to get container status \"cb0fab35511a77e12ce08d6dd566417eb2c807ba6ba6ffee04a148b47e1abd0e\": rpc error: code = NotFound desc = could not find container \"cb0fab35511a77e12ce08d6dd566417eb2c807ba6ba6ffee04a148b47e1abd0e\": container with ID starting with cb0fab35511a77e12ce08d6dd566417eb2c807ba6ba6ffee04a148b47e1abd0e not found: ID does not exist" Apr 24 23:18:28.616988 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:28.614919 2574 status_manager.go:895] "Failed to get status for pod" 
podUID="0dd851af-6887-427f-9ac6-48a6b0ded5b5" pod="openshift-must-gather-gvdqg/must-gather-c7ldc" err="pods \"must-gather-c7ldc\" is forbidden: User \"system:node:ip-10-0-133-161.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-gvdqg\": no relationship found between node 'ip-10-0-133-161.ec2.internal' and this object" Apr 24 23:18:28.848832 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:28.848755 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dd851af-6887-427f-9ac6-48a6b0ded5b5" path="/var/lib/kubelet/pods/0dd851af-6887-427f-9ac6-48a6b0ded5b5/volumes" Apr 24 23:18:28.849636 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:28.849611 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88cfa972-2a64-4953-8cee-2b55422bc533" path="/var/lib/kubelet/pods/88cfa972-2a64-4953-8cee-2b55422bc533/volumes" Apr 24 23:18:29.479340 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:29.479309 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-8d5c47b75-qtv9m_c25776a2-f338-47e3-a66f-0dbbe5007841/metrics-server/0.log" Apr 24 23:18:29.503498 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:29.503458 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-k5srd_7d231644-f466-44c6-8279-d34341dcfa89/monitoring-plugin/0.log" Apr 24 23:18:29.601547 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:29.601478 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-j2lwr_c85adc68-9477-4b3e-b89a-43fb8defbdbb/node-exporter/0.log" Apr 24 23:18:29.626486 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:29.626460 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-j2lwr_c85adc68-9477-4b3e-b89a-43fb8defbdbb/kube-rbac-proxy/0.log" Apr 24 23:18:29.651730 ip-10-0-133-161 kubenswrapper[2574]: I0424 
23:18:29.651697 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-j2lwr_c85adc68-9477-4b3e-b89a-43fb8defbdbb/init-textfile/0.log" Apr 24 23:18:29.752817 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:29.752737 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-7p8rc_96dd1a1c-c821-4172-b742-661cf436945c/kube-rbac-proxy-main/0.log" Apr 24 23:18:29.776303 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:29.776267 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-7p8rc_96dd1a1c-c821-4172-b742-661cf436945c/kube-rbac-proxy-self/0.log" Apr 24 23:18:29.798047 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:29.798014 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-7p8rc_96dd1a1c-c821-4172-b742-661cf436945c/openshift-state-metrics/0.log" Apr 24 23:18:29.850385 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:29.850355 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c74ab500-f85a-4bfe-8de4-08bb4710461e/prometheus/0.log" Apr 24 23:18:29.870011 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:29.869986 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c74ab500-f85a-4bfe-8de4-08bb4710461e/config-reloader/0.log" Apr 24 23:18:29.891689 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:29.891587 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c74ab500-f85a-4bfe-8de4-08bb4710461e/thanos-sidecar/0.log" Apr 24 23:18:29.914355 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:29.914330 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c74ab500-f85a-4bfe-8de4-08bb4710461e/kube-rbac-proxy-web/0.log" Apr 24 23:18:29.940789 
ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:29.940755 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c74ab500-f85a-4bfe-8de4-08bb4710461e/kube-rbac-proxy/0.log" Apr 24 23:18:29.963975 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:29.963925 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c74ab500-f85a-4bfe-8de4-08bb4710461e/kube-rbac-proxy-thanos/0.log" Apr 24 23:18:29.987318 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:29.987283 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c74ab500-f85a-4bfe-8de4-08bb4710461e/init-config-reloader/0.log" Apr 24 23:18:30.019379 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:30.019301 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-gdzpg_c9e17516-d534-43af-9540-9a890f73e5f2/prometheus-operator/0.log" Apr 24 23:18:30.039462 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:30.039424 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-gdzpg_c9e17516-d534-43af-9540-9a890f73e5f2/kube-rbac-proxy/0.log" Apr 24 23:18:30.095524 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:30.095491 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-576b4d6588-xn2zj_87e464d4-7566-453e-bc71-f3a661e74c9d/telemeter-client/0.log" Apr 24 23:18:30.117506 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:30.117474 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-576b4d6588-xn2zj_87e464d4-7566-453e-bc71-f3a661e74c9d/reload/0.log" Apr 24 23:18:30.139109 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:30.139078 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_telemeter-client-576b4d6588-xn2zj_87e464d4-7566-453e-bc71-f3a661e74c9d/kube-rbac-proxy/0.log" Apr 24 23:18:31.945890 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:31.945866 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6z9wv_2ae54e6c-a291-4b2e-8885-ba0d08f9048c/console-operator/1.log" Apr 24 23:18:31.951581 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:31.951556 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6z9wv_2ae54e6c-a291-4b2e-8885-ba0d08f9048c/console-operator/2.log" Apr 24 23:18:32.329473 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:32.329400 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-84867b96fc-rbb57_5a693059-25b8-4d4d-89e9-694a049e62a0/console/0.log" Apr 24 23:18:32.363895 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:32.363868 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-hcwpm_584c5d46-6f2a-4b43-a0d2-132c52dd9409/download-server/0.log" Apr 24 23:18:33.000068 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:33.000033 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jpd6k/perf-node-gather-daemonset-n8kr5"] Apr 24 23:18:33.000542 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:33.000341 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0dd851af-6887-427f-9ac6-48a6b0ded5b5" containerName="copy" Apr 24 23:18:33.000542 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:33.000351 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dd851af-6887-427f-9ac6-48a6b0ded5b5" containerName="copy" Apr 24 23:18:33.000542 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:33.000364 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88cfa972-2a64-4953-8cee-2b55422bc533" 
containerName="switch-graph-d0532" Apr 24 23:18:33.000542 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:33.000369 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="88cfa972-2a64-4953-8cee-2b55422bc533" containerName="switch-graph-d0532" Apr 24 23:18:33.000542 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:33.000378 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0dd851af-6887-427f-9ac6-48a6b0ded5b5" containerName="gather" Apr 24 23:18:33.000542 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:33.000383 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dd851af-6887-427f-9ac6-48a6b0ded5b5" containerName="gather" Apr 24 23:18:33.000542 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:33.000436 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="0dd851af-6887-427f-9ac6-48a6b0ded5b5" containerName="copy" Apr 24 23:18:33.000542 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:33.000445 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="88cfa972-2a64-4953-8cee-2b55422bc533" containerName="switch-graph-d0532" Apr 24 23:18:33.000542 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:33.000453 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="0dd851af-6887-427f-9ac6-48a6b0ded5b5" containerName="gather" Apr 24 23:18:33.004743 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:33.004722 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-n8kr5" Apr 24 23:18:33.010842 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:33.010818 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jpd6k/perf-node-gather-daemonset-n8kr5"] Apr 24 23:18:33.097984 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:33.097910 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ca94cbcb-fd89-4699-bf4a-cddf1c01cc6f-lib-modules\") pod \"perf-node-gather-daemonset-n8kr5\" (UID: \"ca94cbcb-fd89-4699-bf4a-cddf1c01cc6f\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-n8kr5" Apr 24 23:18:33.098177 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:33.097990 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ca94cbcb-fd89-4699-bf4a-cddf1c01cc6f-proc\") pod \"perf-node-gather-daemonset-n8kr5\" (UID: \"ca94cbcb-fd89-4699-bf4a-cddf1c01cc6f\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-n8kr5" Apr 24 23:18:33.098177 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:33.098071 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4brd7\" (UniqueName: \"kubernetes.io/projected/ca94cbcb-fd89-4699-bf4a-cddf1c01cc6f-kube-api-access-4brd7\") pod \"perf-node-gather-daemonset-n8kr5\" (UID: \"ca94cbcb-fd89-4699-bf4a-cddf1c01cc6f\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-n8kr5" Apr 24 23:18:33.098177 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:33.098108 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ca94cbcb-fd89-4699-bf4a-cddf1c01cc6f-sys\") pod \"perf-node-gather-daemonset-n8kr5\" (UID: 
\"ca94cbcb-fd89-4699-bf4a-cddf1c01cc6f\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-n8kr5" Apr 24 23:18:33.098177 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:33.098147 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ca94cbcb-fd89-4699-bf4a-cddf1c01cc6f-podres\") pod \"perf-node-gather-daemonset-n8kr5\" (UID: \"ca94cbcb-fd89-4699-bf4a-cddf1c01cc6f\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-n8kr5" Apr 24 23:18:33.199623 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:33.199590 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ca94cbcb-fd89-4699-bf4a-cddf1c01cc6f-lib-modules\") pod \"perf-node-gather-daemonset-n8kr5\" (UID: \"ca94cbcb-fd89-4699-bf4a-cddf1c01cc6f\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-n8kr5" Apr 24 23:18:33.199623 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:33.199627 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ca94cbcb-fd89-4699-bf4a-cddf1c01cc6f-proc\") pod \"perf-node-gather-daemonset-n8kr5\" (UID: \"ca94cbcb-fd89-4699-bf4a-cddf1c01cc6f\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-n8kr5" Apr 24 23:18:33.199884 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:33.199656 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4brd7\" (UniqueName: \"kubernetes.io/projected/ca94cbcb-fd89-4699-bf4a-cddf1c01cc6f-kube-api-access-4brd7\") pod \"perf-node-gather-daemonset-n8kr5\" (UID: \"ca94cbcb-fd89-4699-bf4a-cddf1c01cc6f\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-n8kr5" Apr 24 23:18:33.199884 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:33.199678 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ca94cbcb-fd89-4699-bf4a-cddf1c01cc6f-sys\") pod \"perf-node-gather-daemonset-n8kr5\" (UID: \"ca94cbcb-fd89-4699-bf4a-cddf1c01cc6f\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-n8kr5" Apr 24 23:18:33.199884 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:33.199694 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ca94cbcb-fd89-4699-bf4a-cddf1c01cc6f-podres\") pod \"perf-node-gather-daemonset-n8kr5\" (UID: \"ca94cbcb-fd89-4699-bf4a-cddf1c01cc6f\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-n8kr5" Apr 24 23:18:33.199884 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:33.199722 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ca94cbcb-fd89-4699-bf4a-cddf1c01cc6f-proc\") pod \"perf-node-gather-daemonset-n8kr5\" (UID: \"ca94cbcb-fd89-4699-bf4a-cddf1c01cc6f\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-n8kr5" Apr 24 23:18:33.199884 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:33.199752 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ca94cbcb-fd89-4699-bf4a-cddf1c01cc6f-sys\") pod \"perf-node-gather-daemonset-n8kr5\" (UID: \"ca94cbcb-fd89-4699-bf4a-cddf1c01cc6f\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-n8kr5" Apr 24 23:18:33.199884 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:33.199768 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ca94cbcb-fd89-4699-bf4a-cddf1c01cc6f-lib-modules\") pod \"perf-node-gather-daemonset-n8kr5\" (UID: \"ca94cbcb-fd89-4699-bf4a-cddf1c01cc6f\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-n8kr5" Apr 24 23:18:33.199884 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:33.199811 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ca94cbcb-fd89-4699-bf4a-cddf1c01cc6f-podres\") pod \"perf-node-gather-daemonset-n8kr5\" (UID: \"ca94cbcb-fd89-4699-bf4a-cddf1c01cc6f\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-n8kr5" Apr 24 23:18:33.207801 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:33.207773 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4brd7\" (UniqueName: \"kubernetes.io/projected/ca94cbcb-fd89-4699-bf4a-cddf1c01cc6f-kube-api-access-4brd7\") pod \"perf-node-gather-daemonset-n8kr5\" (UID: \"ca94cbcb-fd89-4699-bf4a-cddf1c01cc6f\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-n8kr5" Apr 24 23:18:33.317419 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:33.317331 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-n8kr5" Apr 24 23:18:33.459855 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:33.459829 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jpd6k/perf-node-gather-daemonset-n8kr5"] Apr 24 23:18:33.462869 ip-10-0-133-161 kubenswrapper[2574]: W0424 23:18:33.462825 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podca94cbcb_fd89_4699_bf4a_cddf1c01cc6f.slice/crio-82bcd349c4ad4b1c6ec2dd4996659bdecd623f2f2d54f28e3dc9834f0bbf38cf WatchSource:0}: Error finding container 82bcd349c4ad4b1c6ec2dd4996659bdecd623f2f2d54f28e3dc9834f0bbf38cf: Status 404 returned error can't find the container with id 82bcd349c4ad4b1c6ec2dd4996659bdecd623f2f2d54f28e3dc9834f0bbf38cf Apr 24 23:18:33.511732 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:33.511715 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-gt62l_8a25583f-2bb0-4f33-93bd-3e30aec48cee/dns/0.log" Apr 24 23:18:33.532581 ip-10-0-133-161 
kubenswrapper[2574]: I0424 23:18:33.532559 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-gt62l_8a25583f-2bb0-4f33-93bd-3e30aec48cee/kube-rbac-proxy/0.log" Apr 24 23:18:33.583572 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:33.583517 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-n8kr5" event={"ID":"ca94cbcb-fd89-4699-bf4a-cddf1c01cc6f","Type":"ContainerStarted","Data":"1c4f8da9652adb7548b2d0ef7b91599a69a5f2a48ef6a9b12baec014c7c2a046"} Apr 24 23:18:33.583572 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:33.583548 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-n8kr5" event={"ID":"ca94cbcb-fd89-4699-bf4a-cddf1c01cc6f","Type":"ContainerStarted","Data":"82bcd349c4ad4b1c6ec2dd4996659bdecd623f2f2d54f28e3dc9834f0bbf38cf"} Apr 24 23:18:33.583700 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:33.583656 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-n8kr5" Apr 24 23:18:33.597590 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:33.597557 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-n8kr5" podStartSLOduration=1.597543971 podStartE2EDuration="1.597543971s" podCreationTimestamp="2026-04-24 23:18:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:18:33.596450867 +0000 UTC m=+2923.320043436" watchObservedRunningTime="2026-04-24 23:18:33.597543971 +0000 UTC m=+2923.321136541" Apr 24 23:18:33.619226 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:33.619201 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-knznw_8acc4f4f-831f-4c10-a187-01230734276e/dns-node-resolver/0.log" Apr 24 
23:18:34.096068 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:34.096043 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xx2md_939c10b8-9c56-4502-a65a-30206c40fa9d/node-ca/0.log"
Apr 24 23:18:35.170163 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:35.170106 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-vffmw_8eff2e0b-1c86-4882-8b55-3fb02eb38ebe/serve-healthcheck-canary/0.log"
Apr 24 23:18:35.516115 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:35.516040 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-m7xn9_e0595875-db43-466a-aa45-15f3138253c4/insights-operator/0.log"
Apr 24 23:18:35.517040 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:35.517021 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-m7xn9_e0595875-db43-466a-aa45-15f3138253c4/insights-operator/1.log"
Apr 24 23:18:35.653565 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:35.653534 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qmpbb_31f36569-beca-46b9-b9fd-88f0cc0404d1/kube-rbac-proxy/0.log"
Apr 24 23:18:35.675245 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:35.675216 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qmpbb_31f36569-beca-46b9-b9fd-88f0cc0404d1/exporter/0.log"
Apr 24 23:18:35.720789 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:35.720766 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qmpbb_31f36569-beca-46b9-b9fd-88f0cc0404d1/extractor/0.log"
Apr 24 23:18:38.012935 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:38.012904 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-96ngm_8ac16476-4c74-4b62-b80e-11e6fc65bd17/s3-init/0.log"
Apr 24 23:18:39.599117 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:39.599080 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-n8kr5"
Apr 24 23:18:41.723758 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:41.723656 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-scg92_63b4d91d-1df3-49ed-9d57-7a0ec28ca165/migrator/0.log"
Apr 24 23:18:41.745150 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:41.745125 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-scg92_63b4d91d-1df3-49ed-9d57-7a0ec28ca165/graceful-termination/0.log"
Apr 24 23:18:43.110508 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:43.110481 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7rncr_ca9127c4-a533-44bc-9593-d1308d3b463f/kube-multus-additional-cni-plugins/0.log"
Apr 24 23:18:43.131815 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:43.131791 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7rncr_ca9127c4-a533-44bc-9593-d1308d3b463f/egress-router-binary-copy/0.log"
Apr 24 23:18:43.157934 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:43.157912 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7rncr_ca9127c4-a533-44bc-9593-d1308d3b463f/cni-plugins/0.log"
Apr 24 23:18:43.181486 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:43.181465 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7rncr_ca9127c4-a533-44bc-9593-d1308d3b463f/bond-cni-plugin/0.log"
Apr 24 23:18:43.208316 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:43.208296 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7rncr_ca9127c4-a533-44bc-9593-d1308d3b463f/routeoverride-cni/0.log"
Apr 24 23:18:43.231654 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:43.231632 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7rncr_ca9127c4-a533-44bc-9593-d1308d3b463f/whereabouts-cni-bincopy/0.log"
Apr 24 23:18:43.254805 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:43.254775 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7rncr_ca9127c4-a533-44bc-9593-d1308d3b463f/whereabouts-cni/0.log"
Apr 24 23:18:43.773063 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:43.773025 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ppjh8_eebbd623-0913-43a3-ad50-13184bd5baaa/kube-multus/0.log"
Apr 24 23:18:43.836989 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:43.836941 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-8ztg8_d0c4cf71-fe26-4a65-bc22-b98bb5827d73/network-metrics-daemon/0.log"
Apr 24 23:18:43.856657 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:43.856631 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-8ztg8_d0c4cf71-fe26-4a65-bc22-b98bb5827d73/kube-rbac-proxy/0.log"
Apr 24 23:18:44.958306 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:44.958280 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b4zlm_ce4f31c6-c297-4d19-b0f3-f05c45c17a8e/ovn-controller/0.log"
Apr 24 23:18:44.988679 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:44.988649 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b4zlm_ce4f31c6-c297-4d19-b0f3-f05c45c17a8e/ovn-acl-logging/0.log"
Apr 24 23:18:45.008012 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:45.007971 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b4zlm_ce4f31c6-c297-4d19-b0f3-f05c45c17a8e/kube-rbac-proxy-node/0.log"
Apr 24 23:18:45.029133 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:45.029096 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b4zlm_ce4f31c6-c297-4d19-b0f3-f05c45c17a8e/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 23:18:45.052890 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:45.052867 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b4zlm_ce4f31c6-c297-4d19-b0f3-f05c45c17a8e/northd/0.log"
Apr 24 23:18:45.072926 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:45.072903 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b4zlm_ce4f31c6-c297-4d19-b0f3-f05c45c17a8e/nbdb/0.log"
Apr 24 23:18:45.093265 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:45.093248 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b4zlm_ce4f31c6-c297-4d19-b0f3-f05c45c17a8e/sbdb/0.log"
Apr 24 23:18:45.206423 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:45.206389 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b4zlm_ce4f31c6-c297-4d19-b0f3-f05c45c17a8e/ovnkube-controller/0.log"
Apr 24 23:18:46.490932 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:46.490902 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-rscxl_f6f16e94-fa06-4d71-bfc7-e6b272e8d1bc/network-check-target-container/0.log"
Apr 24 23:18:47.430124 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:47.430087 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-xj45v_319510cc-198d-4518-b16b-5a7c26091db0/iptables-alerter/0.log"
Apr 24 23:18:48.075431 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:48.075401 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-klbmp_ea8f7937-1cbc-424c-ba7e-220e1d538dbe/tuned/0.log"
Apr 24 23:18:49.858715 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:49.858684 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-zq5vs_7c99265f-97ef-4683-8d9c-f7c17dd3f1ec/cluster-samples-operator/0.log"
Apr 24 23:18:49.877000 ip-10-0-133-161 kubenswrapper[2574]: I0424 23:18:49.876974 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-zq5vs_7c99265f-97ef-4683-8d9c-f7c17dd3f1ec/cluster-samples-operator-watch/0.log"