Apr 21 07:51:18.465928 ip-10-0-130-176 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 07:51:19.026455 ip-10-0-130-176 kubenswrapper[2567]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 07:51:19.026455 ip-10-0-130-176 kubenswrapper[2567]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 07:51:19.026455 ip-10-0-130-176 kubenswrapper[2567]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 07:51:19.026455 ip-10-0-130-176 kubenswrapper[2567]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 07:51:19.026455 ip-10-0-130-176 kubenswrapper[2567]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 07:51:19.026455 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.979017 2567 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 07:51:19.026455 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982458 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 07:51:19.026455 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982474 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 07:51:19.026455 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982478 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 07:51:19.026455 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982482 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 07:51:19.026455 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982486 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 07:51:19.026455 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982491 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 07:51:19.026455 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982495 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 07:51:19.026455 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982499 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 07:51:19.027697 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982503 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 07:51:19.027697 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982506 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 07:51:19.027697 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982519 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 07:51:19.027697 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982523 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 07:51:19.027697 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982527 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 07:51:19.027697 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982531 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 07:51:19.027697 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982534 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 07:51:19.027697 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982538 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 07:51:19.027697 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982542 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 07:51:19.027697 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982545 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 07:51:19.027697 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982549 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 07:51:19.027697 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982552 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 07:51:19.027697 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982556 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 07:51:19.027697 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982560 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 07:51:19.027697 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982564 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 07:51:19.027697 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982567 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 07:51:19.027697 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982571 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 07:51:19.027697 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982575 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 07:51:19.027697 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982579 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 07:51:19.027697 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982583 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 07:51:19.028480 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982587 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 07:51:19.028480 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982590 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 07:51:19.028480 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982594 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 07:51:19.028480 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982598 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 07:51:19.028480 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982601 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 07:51:19.028480 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982605 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 07:51:19.028480 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982609 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 07:51:19.028480 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982614 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 07:51:19.028480 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982617 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 07:51:19.028480 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982622 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 07:51:19.028480 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982626 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 07:51:19.028480 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982629 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 07:51:19.028480 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982633 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 07:51:19.028480 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982637 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 07:51:19.028480 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982640 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 07:51:19.028480 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982644 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 07:51:19.028480 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982647 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 07:51:19.028480 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982663 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 07:51:19.028480 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982667 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 07:51:19.028480 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982671 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 07:51:19.029077 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982675 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 07:51:19.029077 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982680 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 07:51:19.029077 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982686 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 07:51:19.029077 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982692 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 07:51:19.029077 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982697 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 07:51:19.029077 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982701 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 07:51:19.029077 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982713 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 07:51:19.029077 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982717 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 07:51:19.029077 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982721 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 07:51:19.029077 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982725 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 07:51:19.029077 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982730 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 07:51:19.029077 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982734 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 07:51:19.029077 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982738 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 07:51:19.029077 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982744 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 07:51:19.029077 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982749 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 07:51:19.029077 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982754 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 07:51:19.029077 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982758 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 07:51:19.029077 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982762 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 07:51:19.029077 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982767 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 07:51:19.029974 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982771 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 07:51:19.029974 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982775 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 07:51:19.029974 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982779 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 07:51:19.029974 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982783 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 07:51:19.029974 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982788 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 07:51:19.029974 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982791 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 07:51:19.029974 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982796 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 07:51:19.029974 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982800 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 07:51:19.029974 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982803 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 07:51:19.029974 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982807 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 07:51:19.029974 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982811 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 07:51:19.029974 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982816 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 07:51:19.029974 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982821 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 07:51:19.029974 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982825 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 07:51:19.029974 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982828 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 07:51:19.029974 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982832 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 07:51:19.029974 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982836 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 07:51:19.029974 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982840 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 07:51:19.029974 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.982843 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 07:51:19.029974 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983374 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 07:51:19.030789 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983381 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 07:51:19.030789 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983385 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 07:51:19.030789 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983389 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 07:51:19.030789 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983393 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 07:51:19.030789 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983398 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 07:51:19.030789 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983401 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 07:51:19.030789 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983405 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 07:51:19.030789 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983409 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 07:51:19.030789 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983414 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 07:51:19.030789 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983417 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 07:51:19.030789 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983421 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 07:51:19.030789 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983426 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 07:51:19.030789 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983430 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 07:51:19.030789 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983436 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 07:51:19.030789 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983440 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 07:51:19.030789 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983445 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 07:51:19.030789 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983448 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 07:51:19.030789 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983453 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 07:51:19.030789 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983457 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 07:51:19.031514 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983461 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 07:51:19.031514 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983465 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 07:51:19.031514 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983477 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 07:51:19.031514 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983480 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 07:51:19.031514 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983484 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 07:51:19.031514 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983489 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 07:51:19.031514 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983492 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 07:51:19.031514 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983496 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 07:51:19.031514 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983500 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 07:51:19.031514 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983503 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 07:51:19.031514 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983507 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 07:51:19.031514 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983511 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 07:51:19.031514 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983514 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 07:51:19.031514 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983518 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 07:51:19.031514 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983521 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 07:51:19.031514 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983525 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 07:51:19.031514 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983529 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 07:51:19.031514 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983533 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 07:51:19.031514 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983539 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 07:51:19.032089 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983545 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 07:51:19.032089 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983549 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 07:51:19.032089 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983553 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 07:51:19.032089 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983557 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 07:51:19.032089 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983561 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 07:51:19.032089 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983565 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 07:51:19.032089 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983570 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 07:51:19.032089 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983574 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 07:51:19.032089 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983578 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 07:51:19.032089 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983582 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 07:51:19.032089 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983585 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 07:51:19.032089 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983589 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 07:51:19.032089 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983593 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 07:51:19.032089 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983596 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 07:51:19.032089 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983599 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 07:51:19.032089 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983602 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 07:51:19.032089 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983606 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 07:51:19.032089 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983610 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 07:51:19.032089 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983613 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 07:51:19.032089 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983617 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 07:51:19.032920 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983621 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 07:51:19.032920 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983624 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 07:51:19.032920 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983627 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 07:51:19.032920 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983631 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 07:51:19.032920 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983634 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 07:51:19.032920 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983638 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 07:51:19.032920 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983642 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 07:51:19.032920 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983646 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 07:51:19.032920 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983668 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 07:51:19.032920 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983672 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 07:51:19.032920 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983675 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 07:51:19.032920 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983678 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 07:51:19.032920 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983684 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 07:51:19.032920 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983688 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 07:51:19.032920 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983692 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 07:51:19.032920 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983695 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 07:51:19.032920 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983699 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 07:51:19.032920 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983702 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 07:51:19.032920 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983706 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 07:51:19.032920 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983710 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 07:51:19.033491 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983714 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 07:51:19.033491 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983718 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 07:51:19.033491 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983722 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 07:51:19.033491 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983725 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 07:51:19.033491 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983729 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 07:51:19.033491 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983732 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 07:51:19.033491 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.983736 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 07:51:19.033491 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.983843 2567 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 07:51:19.033491 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.983856 2567 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 07:51:19.033491 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.983865 2567 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 07:51:19.033491 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.983871 2567 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 07:51:19.033491 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.983878 2567 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 07:51:19.033491 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.983883 2567 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 07:51:19.033491 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.983889 2567 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 07:51:19.033491 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.983895 2567 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 07:51:19.033491 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.983900 2567 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 07:51:19.033491 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.983904 2567 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 07:51:19.033491 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.983910 2567 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 07:51:19.033491 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.983915 2567 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 07:51:19.033491 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.983919 2567 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 07:51:19.033491 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.983929 2567 flags.go:64] FLAG: --cgroup-root=""
Apr 21 07:51:19.033491 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.983934 2567 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 07:51:19.034252 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.983938 2567 flags.go:64] FLAG: --client-ca-file=""
Apr 21 07:51:19.034252 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.983942 2567 flags.go:64] FLAG: --cloud-config=""
Apr 21 07:51:19.034252 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.983946 2567 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 07:51:19.034252 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.983950 2567 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 07:51:19.034252 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.983957 2567 flags.go:64] FLAG: --cluster-domain=""
Apr 21 07:51:19.034252 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.983962 2567 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 07:51:19.034252 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.983967 2567 flags.go:64] FLAG: --config-dir=""
Apr 21 07:51:19.034252 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.983970 2567 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 07:51:19.034252 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.983975 2567 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 07:51:19.034252 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.983981 2567 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 07:51:19.034252 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.983994 2567 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 07:51:19.034252 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.983999 2567 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 07:51:19.034252 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984004 2567 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 07:51:19.034252 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984009 2567 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 07:51:19.034252 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984013 2567 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 07:51:19.034252 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984017 2567 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 07:51:19.034252 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984022 2567 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 07:51:19.034252 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984026 2567 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 07:51:19.034252 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984032 2567 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 07:51:19.034252 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984036 2567 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 07:51:19.034252 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984041 2567 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 07:51:19.034252 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984046 2567 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 07:51:19.034252 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984050 2567 flags.go:64] FLAG: --enable-server="true"
Apr 21 07:51:19.034252 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984055 2567 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 07:51:19.034252 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984061 2567 flags.go:64] FLAG: --event-burst="100"
Apr 21 07:51:19.035004 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984065 2567 flags.go:64] FLAG: --event-qps="50"
Apr 21 07:51:19.035004 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984070 2567 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 07:51:19.035004 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984075 2567 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 07:51:19.035004 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984079 2567 flags.go:64] FLAG: --eviction-hard=""
Apr 21 07:51:19.035004 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984087 2567 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 07:51:19.035004 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984093 2567 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 07:51:19.035004 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984097 2567 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21 07:51:19.035004 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984102 2567 flags.go:64] FLAG: --eviction-soft=""
Apr 21 07:51:19.035004 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984106 2567 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 21 07:51:19.035004 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984110 2567 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 21 07:51:19.035004 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984114 2567 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 21 07:51:19.035004 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984119 2567 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 21 07:51:19.035004 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984123 2567 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 21 07:51:19.035004 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984128 2567 flags.go:64] FLAG: --fail-swap-on="true"
Apr 21 07:51:19.035004 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984132 2567 flags.go:64] FLAG: --feature-gates=""
Apr 21 07:51:19.035004 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984137 2567 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 21 07:51:19.035004 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984142 2567 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 21 07:51:19.035004 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984146 2567 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 21 07:51:19.035004 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984162 2567 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 21 07:51:19.035004 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984167 2567 flags.go:64] FLAG: 
--healthz-port="10248" Apr 21 07:51:19.035004 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984171 2567 flags.go:64] FLAG: --help="false" Apr 21 07:51:19.035004 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984175 2567 flags.go:64] FLAG: --hostname-override="ip-10-0-130-176.ec2.internal" Apr 21 07:51:19.035004 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984180 2567 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 07:51:19.035004 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984184 2567 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 07:51:19.194552 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984188 2567 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 07:51:19.194552 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984193 2567 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 07:51:19.194552 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984198 2567 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 07:51:19.194552 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984206 2567 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 07:51:19.194552 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984211 2567 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 07:51:19.194552 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984215 2567 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 07:51:19.194552 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984219 2567 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 07:51:19.194552 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984223 2567 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 07:51:19.194552 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984228 2567 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 07:51:19.194552 ip-10-0-130-176 
kubenswrapper[2567]: I0421 07:51:18.984232 2567 flags.go:64] FLAG: --kube-reserved="" Apr 21 07:51:19.194552 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984238 2567 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 07:51:19.194552 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984242 2567 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 07:51:19.194552 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984246 2567 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 07:51:19.194552 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984254 2567 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 07:51:19.194552 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984258 2567 flags.go:64] FLAG: --lock-file="" Apr 21 07:51:19.194552 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984262 2567 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 07:51:19.194552 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984267 2567 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 07:51:19.194552 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984271 2567 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 07:51:19.194552 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984279 2567 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 07:51:19.194552 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984283 2567 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 07:51:19.194552 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984287 2567 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 07:51:19.194552 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984291 2567 flags.go:64] FLAG: --logging-format="text" Apr 21 07:51:19.194552 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984295 2567 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 07:51:19.099216 ip-10-0-130-176 systemd[1]: Started Kubernetes Kubelet. 
Apr 21 07:51:19.222459 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984300 2567 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 07:51:19.222459 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984304 2567 flags.go:64] FLAG: --manifest-url="" Apr 21 07:51:19.222459 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984309 2567 flags.go:64] FLAG: --manifest-url-header="" Apr 21 07:51:19.222459 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984316 2567 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 07:51:19.222459 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984328 2567 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 07:51:19.222459 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984334 2567 flags.go:64] FLAG: --max-pods="110" Apr 21 07:51:19.222459 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984339 2567 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 07:51:19.222459 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984343 2567 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 07:51:19.222459 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984348 2567 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 07:51:19.222459 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984352 2567 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 07:51:19.222459 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984356 2567 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 07:51:19.222459 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984361 2567 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 07:51:19.222459 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984367 2567 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 07:51:19.222459 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984378 2567 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 07:51:19.222459 ip-10-0-130-176 
kubenswrapper[2567]: I0421 07:51:18.984383 2567 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 07:51:19.222459 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984387 2567 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 07:51:19.222459 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984392 2567 flags.go:64] FLAG: --pod-cidr="" Apr 21 07:51:19.222459 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984396 2567 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 07:51:19.222459 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984405 2567 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 07:51:19.222459 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984409 2567 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 07:51:19.222459 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984414 2567 flags.go:64] FLAG: --pods-per-core="0" Apr 21 07:51:19.222459 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984418 2567 flags.go:64] FLAG: --port="10250" Apr 21 07:51:19.222459 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984425 2567 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 07:51:19.222459 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984429 2567 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-06bbdca4a5777878e" Apr 21 07:51:19.223522 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984434 2567 flags.go:64] FLAG: --qos-reserved="" Apr 21 07:51:19.223522 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984438 2567 flags.go:64] FLAG: --read-only-port="10255" Apr 21 07:51:19.223522 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984443 2567 flags.go:64] FLAG: --register-node="true" Apr 21 07:51:19.223522 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984448 2567 flags.go:64] FLAG: --register-schedulable="true" Apr 21 07:51:19.223522 ip-10-0-130-176 kubenswrapper[2567]: I0421 
07:51:18.984452 2567 flags.go:64] FLAG: --register-with-taints="" Apr 21 07:51:19.223522 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984458 2567 flags.go:64] FLAG: --registry-burst="10" Apr 21 07:51:19.223522 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984462 2567 flags.go:64] FLAG: --registry-qps="5" Apr 21 07:51:19.223522 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984467 2567 flags.go:64] FLAG: --reserved-cpus="" Apr 21 07:51:19.223522 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984471 2567 flags.go:64] FLAG: --reserved-memory="" Apr 21 07:51:19.223522 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984477 2567 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 07:51:19.223522 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984481 2567 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 07:51:19.223522 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984485 2567 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 07:51:19.223522 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984489 2567 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 07:51:19.223522 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984497 2567 flags.go:64] FLAG: --runonce="false" Apr 21 07:51:19.223522 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984502 2567 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 07:51:19.223522 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984506 2567 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 07:51:19.223522 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984511 2567 flags.go:64] FLAG: --seccomp-default="false" Apr 21 07:51:19.223522 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984515 2567 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 07:51:19.223522 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984519 2567 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 07:51:19.223522 ip-10-0-130-176 
kubenswrapper[2567]: I0421 07:51:18.984524 2567 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 07:51:19.223522 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984528 2567 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 07:51:19.223522 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984534 2567 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 07:51:19.223522 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984538 2567 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 07:51:19.223522 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984542 2567 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 07:51:19.223522 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984546 2567 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 07:51:19.223522 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984551 2567 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 07:51:19.224411 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984555 2567 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 07:51:19.224411 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984559 2567 flags.go:64] FLAG: --system-cgroups="" Apr 21 07:51:19.224411 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984564 2567 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 07:51:19.224411 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984571 2567 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 07:51:19.224411 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984577 2567 flags.go:64] FLAG: --tls-cert-file="" Apr 21 07:51:19.224411 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984581 2567 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 07:51:19.224411 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984587 2567 flags.go:64] FLAG: --tls-min-version="" Apr 21 07:51:19.224411 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984591 2567 flags.go:64] FLAG: 
--tls-private-key-file="" Apr 21 07:51:19.224411 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984595 2567 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 07:51:19.224411 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984599 2567 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 07:51:19.224411 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984604 2567 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 07:51:19.224411 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984608 2567 flags.go:64] FLAG: --v="2" Apr 21 07:51:19.224411 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984615 2567 flags.go:64] FLAG: --version="false" Apr 21 07:51:19.224411 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984620 2567 flags.go:64] FLAG: --vmodule="" Apr 21 07:51:19.224411 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984626 2567 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 07:51:19.224411 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.984632 2567 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 07:51:19.224411 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984779 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 07:51:19.224411 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984784 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 21 07:51:19.224411 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984788 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 07:51:19.224411 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984793 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 07:51:19.224411 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984797 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 07:51:19.224411 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984801 2567 
feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 07:51:19.224411 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984804 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 07:51:19.225370 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984808 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 07:51:19.225370 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984812 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 07:51:19.225370 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984816 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 07:51:19.225370 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984820 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 07:51:19.225370 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984825 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 07:51:19.225370 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984829 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 07:51:19.225370 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984833 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 07:51:19.225370 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984836 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 07:51:19.225370 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984841 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 07:51:19.225370 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984845 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 07:51:19.225370 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984849 2567 feature_gate.go:328] unrecognized feature gate: 
VSphereMixedNodeEnv Apr 21 07:51:19.225370 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984853 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 07:51:19.225370 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984857 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 07:51:19.225370 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984862 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 07:51:19.225370 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984866 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 07:51:19.225370 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984869 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 07:51:19.225370 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984873 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 07:51:19.225370 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984877 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 07:51:19.225370 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984880 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 07:51:19.225370 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984884 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 07:51:19.226047 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984888 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 07:51:19.226047 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984892 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 07:51:19.226047 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984895 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 07:51:19.226047 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984899 2567 
feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 07:51:19.226047 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984903 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 07:51:19.226047 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984907 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 07:51:19.226047 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984911 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 07:51:19.226047 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984914 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 07:51:19.226047 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984918 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 07:51:19.226047 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984922 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 07:51:19.226047 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984926 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 07:51:19.226047 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984930 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 07:51:19.226047 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984934 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 07:51:19.226047 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984937 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 07:51:19.226047 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984941 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 07:51:19.226047 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984944 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 07:51:19.226047 ip-10-0-130-176 
kubenswrapper[2567]: W0421 07:51:18.984950 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 07:51:19.226047 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984953 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 07:51:19.226047 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984957 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 07:51:19.226047 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984960 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 07:51:19.226795 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984964 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 07:51:19.226795 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984968 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 07:51:19.226795 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984972 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 07:51:19.226795 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984975 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 07:51:19.226795 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984979 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 07:51:19.226795 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984983 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 07:51:19.226795 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984989 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 07:51:19.226795 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.984994 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 07:51:19.226795 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.985000 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 07:51:19.226795 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.985005 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 07:51:19.226795 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.985009 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 07:51:19.226795 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.985013 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 07:51:19.226795 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.985017 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 07:51:19.226795 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.985021 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 07:51:19.226795 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.985025 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 07:51:19.226795 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.985028 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 07:51:19.226795 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.985032 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 07:51:19.226795 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.985036 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 07:51:19.226795 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.985041 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 07:51:19.227762 ip-10-0-130-176 kubenswrapper[2567]: W0421 
07:51:18.985044 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 07:51:19.227762 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.985049 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 07:51:19.227762 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.985052 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 07:51:19.227762 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.985056 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 07:51:19.227762 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.985061 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 07:51:19.227762 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.985065 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 07:51:19.227762 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.985068 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 07:51:19.227762 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.985073 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 07:51:19.227762 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.985076 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 07:51:19.227762 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.985081 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 07:51:19.227762 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.985086 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 07:51:19.227762 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.985090 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 07:51:19.227762 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.985093 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings 
Apr 21 07:51:19.227762 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.985097 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 07:51:19.227762 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.985101 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 07:51:19.227762 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.985105 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 07:51:19.227762 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.985109 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 07:51:19.227762 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.985112 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 07:51:19.227762 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.985117 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 07:51:19.227762 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.985121 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 07:51:19.228407 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.985895 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 07:51:19.228407 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.992982 2567 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 21 07:51:19.228407 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.993001 2567 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 21 07:51:19.228407 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993053 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 07:51:19.228407 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993059 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 07:51:19.228407 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993063 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 07:51:19.228407 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993066 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 07:51:19.228407 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993069 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 07:51:19.228407 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993072 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 07:51:19.228407 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993075 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 07:51:19.228407 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993078 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 07:51:19.228407 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993081 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 07:51:19.228407 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993083 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 07:51:19.228407 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993086 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 07:51:19.228407 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993089 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 07:51:19.228407 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993091 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 07:51:19.229026 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993094 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 07:51:19.229026 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993096 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 07:51:19.229026 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993099 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 07:51:19.229026 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993102 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 07:51:19.229026 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993104 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 07:51:19.229026 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993107 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 07:51:19.229026 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993110 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 07:51:19.229026 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993112 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 07:51:19.229026 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993115 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 07:51:19.229026 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993117 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 07:51:19.229026 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993120 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 07:51:19.229026 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993122 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 07:51:19.229026 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993125 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 07:51:19.229026 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993127 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 07:51:19.229026 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993131 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 07:51:19.229026 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993134 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 07:51:19.229026 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993137 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 07:51:19.229026 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993139 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 07:51:19.229026 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993143 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 07:51:19.229026 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993146 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 07:51:19.229773 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993149 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 07:51:19.229773 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993151 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 07:51:19.229773 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993154 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 07:51:19.229773 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993158 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 07:51:19.229773 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993162 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 07:51:19.229773 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993165 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 07:51:19.229773 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993168 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 07:51:19.229773 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993170 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 07:51:19.229773 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993173 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 07:51:19.229773 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993176 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 07:51:19.229773 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993178 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 07:51:19.229773 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993181 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 07:51:19.229773 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993183 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 07:51:19.229773 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993186 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 07:51:19.229773 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993188 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 07:51:19.229773 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993191 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 07:51:19.229773 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993194 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 07:51:19.229773 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993197 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 07:51:19.229773 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993199 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 07:51:19.230395 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993202 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 07:51:19.230395 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993205 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 07:51:19.230395 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993207 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 07:51:19.230395 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993210 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 07:51:19.230395 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993213 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 07:51:19.230395 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993215 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 07:51:19.230395 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993218 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 07:51:19.230395 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993225 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 07:51:19.230395 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993228 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 07:51:19.230395 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993231 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 07:51:19.230395 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993234 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 07:51:19.230395 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993237 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 07:51:19.230395 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993240 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 07:51:19.230395 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993243 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 07:51:19.230395 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993246 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 07:51:19.230395 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993248 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 07:51:19.230395 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993251 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 07:51:19.230395 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993253 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 07:51:19.230395 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993256 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 07:51:19.449697 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993258 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 07:51:19.449697 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993261 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 07:51:19.449697 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993265 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 07:51:19.449697 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993269 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 07:51:19.449697 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993272 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 07:51:19.449697 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993275 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 07:51:19.449697 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993278 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 07:51:19.449697 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993281 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 07:51:19.449697 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993283 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 07:51:19.449697 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993286 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 07:51:19.449697 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993288 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 07:51:19.449697 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993291 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 07:51:19.449697 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993293 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 07:51:19.449697 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993296 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 07:51:19.449697 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993298 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 07:51:19.450201 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.993304 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 07:51:19.450201 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993405 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 07:51:19.450201 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993410 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 07:51:19.450201 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993414 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 07:51:19.450201 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993417 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 07:51:19.450201 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993421 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 07:51:19.450201 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993425 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 07:51:19.450201 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993429 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 07:51:19.450201 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993432 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 07:51:19.450201 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993434 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 07:51:19.450201 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993437 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 07:51:19.450201 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993441 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 07:51:19.450201 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993444 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 07:51:19.450201 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993447 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 07:51:19.450201 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993449 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 07:51:19.450971 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993452 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 07:51:19.450971 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993454 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 07:51:19.450971 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993457 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 07:51:19.450971 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993460 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 07:51:19.450971 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993463 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 07:51:19.450971 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993465 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 07:51:19.450971 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993468 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 07:51:19.450971 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993470 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 07:51:19.450971 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993473 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 07:51:19.450971 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993475 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 07:51:19.450971 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993478 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 07:51:19.450971 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993480 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 07:51:19.450971 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993483 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 07:51:19.450971 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993486 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 07:51:19.450971 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993488 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 07:51:19.450971 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993491 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 07:51:19.450971 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993493 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 07:51:19.450971 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993496 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 07:51:19.450971 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993498 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 07:51:19.450971 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993501 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 07:51:19.451508 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993503 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 07:51:19.451508 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993506 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 07:51:19.451508 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993512 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 07:51:19.451508 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993516 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 07:51:19.451508 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993519 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 07:51:19.451508 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993521 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 07:51:19.451508 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993524 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 07:51:19.451508 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993527 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 07:51:19.451508 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993529 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 07:51:19.451508 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993533 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 07:51:19.451508 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993535 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 07:51:19.451508 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993538 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 07:51:19.451508 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993540 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 07:51:19.451508 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993543 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 07:51:19.451508 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993546 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 07:51:19.451508 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993548 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 07:51:19.451508 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993551 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 07:51:19.451508 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993553 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 07:51:19.451508 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993556 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 07:51:19.451508 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993558 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 07:51:19.665984 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993561 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 07:51:19.665984 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993564 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 07:51:19.665984 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993566 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 07:51:19.665984 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993569 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 07:51:19.665984 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993571 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 07:51:19.665984 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993574 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 07:51:19.665984 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993576 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 07:51:19.665984 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993579 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 07:51:19.665984 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993581 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 07:51:19.665984 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993584 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 07:51:19.665984 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993587 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 07:51:19.665984 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993590 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 07:51:19.665984 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993594 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 07:51:19.665984 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993597 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 07:51:19.665984 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993601 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 07:51:19.665984 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993605 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 07:51:19.665984 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993608 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 07:51:19.665984 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993611 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 07:51:19.665984 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993613 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 07:51:19.666559 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993616 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 07:51:19.666559 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993619 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 07:51:19.666559 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993621 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 07:51:19.666559 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993624 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 07:51:19.666559 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993627 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 07:51:19.666559 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993631 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 07:51:19.666559 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993634 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 07:51:19.666559 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993637 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 07:51:19.666559 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993640 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 07:51:19.666559 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993642 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 07:51:19.666559 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993645 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 07:51:19.666559 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993647 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 07:51:19.666559 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:18.993667 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 07:51:19.666559 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.993673 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 07:51:19.666559 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.994513 2567 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 21 07:51:19.667091 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.996884 2567 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 21 07:51:19.667091 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.998366 2567 server.go:1019] "Starting client certificate rotation"
Apr 21 07:51:19.667091 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.998464 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 07:51:19.667091 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:18.999314 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 07:51:19.667091 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.029032 2567 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 07:51:19.667091 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.032427 2567 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 07:51:19.667091 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.052566 2567 log.go:25] "Validated CRI v1 runtime API"
Apr 21 07:51:19.667091 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.058143 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 07:51:19.667091 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.060777 2567 log.go:25] "Validated CRI v1 image API"
Apr 21 07:51:19.667091 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.062160 2567 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 21 07:51:19.667091 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.065691 2567 fs.go:135] Filesystem UUIDs: map[08dade24-3bef-4050-aa10-ed3428c8933c:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 f3852c23-c497-4703-9d7f-84a5883fae9b:/dev/nvme0n1p3]
Apr 21 07:51:19.667091 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.065707 2567 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 21 07:51:19.667579 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.071469 2567 manager.go:217] Machine: {Timestamp:2026-04-21 07:51:19.069294551 +0000 UTC m=+0.467541662 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3067950 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec262137fd464d6bd8e7e193f5e92129 SystemUUID:ec262137-fd46-4d6b-d8e7-e193f5e92129 BootID:ab210744-528d-4f61-9d3c-be1073f09a41 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:64:93:1f:1b:c3 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:64:93:1f:1b:c3 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:86:d5:ba:fb:11:61 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 21 07:51:19.667579 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.071567 2567 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 21 07:51:19.667579 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.071644 2567 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 21 07:51:19.667579 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.074036 2567 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 21 07:51:19.667579 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.074130 2567 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-176.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 21 07:51:19.667579 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.074308 2567 topology_manager.go:138] "Creating topology manager with none policy" Apr 21 07:51:19.667579 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.074322 2567 container_manager_linux.go:306] "Creating device plugin manager" Apr 21 07:51:19.667579 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.074340 2567 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 07:51:19.667579 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.075396 2567 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 07:51:19.667579 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.077324 2567 state_mem.go:36] "Initialized new in-memory state store" Apr 21 07:51:19.667579 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.077442 2567 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 21 07:51:19.667579 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.081484 2567 kubelet.go:491] "Attempting to sync node with API server" Apr 21 07:51:19.667579 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.081497 2567 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 21 07:51:19.667579 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.081513 2567 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 21 07:51:19.667579 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.081523 2567 kubelet.go:397] "Adding apiserver pod source" Apr 21 07:51:19.667579 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.081532 2567 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 21 07:51:19.667579 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.082681 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 07:51:19.667579 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.082699 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.088882 2567 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.091506 2567 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.092973 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.092987 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.092993 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.092998 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.093004 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.093010 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.093016 2567 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.093021 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.093029 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.093035 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.093043 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.093052 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.094856 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.094866 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:19.095211 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-176.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:19.095344 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: 
I0421 07:51:19.098457 2567 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.098491 2567 server.go:1295] "Started kubelet" Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.098749 2567 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.098930 2567 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.098990 2567 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.101564 2567 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.101689 2567 server.go:317] "Adding debug handlers to kubelet server" Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.101739 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5vvzh" Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.105265 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5vvzh" Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.107688 2567 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-176.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:19.107640 2567 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-176.ec2.internal.18a84fe09b8a5e26 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-176.ec2.internal,UID:ip-10-0-130-176.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-130-176.ec2.internal,},FirstTimestamp:2026-04-21 07:51:19.09846583 +0000 UTC m=+0.496712941,LastTimestamp:2026-04-21 07:51:19.09846583 +0000 UTC m=+0.496712941,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-176.ec2.internal,}" Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:19.110544 2567 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.110935 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.111521 2567 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.112091 2567 factory.go:55] Registering systemd factory Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.112108 2567 factory.go:223] Registration of the systemd container factory successfully Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.112247 2567 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.112258 2567 volume_manager.go:295] "The 
desired_state_of_world populator starts" Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.112273 2567 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.112307 2567 factory.go:153] Registering CRI-O factory Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.112318 2567 factory.go:223] Registration of the crio container factory successfully Apr 21 07:51:19.668196 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:19.112334 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-176.ec2.internal\" not found" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.112371 2567 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.112396 2567 factory.go:103] Registering Raw factory Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.112407 2567 reconstruct.go:97] "Volume reconstruction finished" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.112417 2567 reconciler.go:26] "Reconciler: start to sync state" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.112420 2567 manager.go:1196] Started watching for new ooms in manager Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.113137 2567 manager.go:319] Starting recovery of all containers Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.113796 2567 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:19.116935 2567 
nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-130-176.ec2.internal\" not found" node="ip-10-0-130-176.ec2.internal" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.130245 2567 manager.go:324] Recovery completed Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.134516 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.136817 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-176.ec2.internal" event="NodeHasSufficientMemory" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.136842 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-176.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.136853 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-176.ec2.internal" event="NodeHasSufficientPID" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.137292 2567 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.137301 2567 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.137319 2567 state_mem.go:36] "Initialized new in-memory state store" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.141287 2567 policy_none.go:49] "None policy: Start" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.141301 2567 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.141310 2567 state_mem.go:35] "Initializing new in-memory state store" Apr 21 
07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.188229 2567 manager.go:341] "Starting Device Plugin manager" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:19.188270 2567 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.188284 2567 server.go:85] "Starting device plugin registration server" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.188531 2567 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.188544 2567 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.188624 2567 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.188715 2567 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.188725 2567 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:19.189240 2567 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:19.189275 2567 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-176.ec2.internal\" not found" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.229781 2567 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.235052 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.235081 2567 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.235099 2567 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.235105 2567 kubelet.go:2451] "Starting kubelet main sync loop" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:19.235142 2567 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.237726 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.288840 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.289590 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-176.ec2.internal" event="NodeHasSufficientMemory" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.289617 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-176.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.289628 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-176.ec2.internal" event="NodeHasSufficientPID" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.289666 2567 
kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-176.ec2.internal" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.299570 2567 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-176.ec2.internal" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:19.299592 2567 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-176.ec2.internal\": node \"ip-10-0-130-176.ec2.internal\" not found" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:19.314451 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-176.ec2.internal\" not found" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.335939 2567 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-176.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-176.ec2.internal"] Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.336012 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.336780 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-176.ec2.internal" event="NodeHasSufficientMemory" Apr 21 07:51:19.669284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.336804 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-176.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 07:51:19.670575 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.336814 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-176.ec2.internal" event="NodeHasSufficientPID" Apr 21 07:51:19.670575 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.339111 2567 kubelet_node_status.go:413] "Setting 
node annotation to enable volume controller attach/detach" Apr 21 07:51:19.670575 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.339232 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-176.ec2.internal" Apr 21 07:51:19.670575 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.339260 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 07:51:19.670575 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.339763 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-176.ec2.internal" event="NodeHasSufficientMemory" Apr 21 07:51:19.670575 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.339766 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-176.ec2.internal" event="NodeHasSufficientMemory" Apr 21 07:51:19.670575 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.339786 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-176.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 07:51:19.670575 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.339799 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-176.ec2.internal" event="NodeHasSufficientPID" Apr 21 07:51:19.670575 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.339789 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-176.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 07:51:19.670575 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.339928 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-176.ec2.internal" event="NodeHasSufficientPID" Apr 21 07:51:19.670575 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.341947 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-176.ec2.internal" Apr 21 07:51:19.670575 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.341969 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 07:51:19.670575 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.342995 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-176.ec2.internal" event="NodeHasSufficientMemory" Apr 21 07:51:19.670575 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.343022 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-176.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 07:51:19.670575 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.343034 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-176.ec2.internal" event="NodeHasSufficientPID" Apr 21 07:51:19.670575 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:19.376512 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-176.ec2.internal\" not found" node="ip-10-0-130-176.ec2.internal" Apr 21 07:51:19.670575 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:19.380850 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-176.ec2.internal\" not found" node="ip-10-0-130-176.ec2.internal" Apr 21 07:51:19.670575 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:19.414925 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-176.ec2.internal\" not found" Apr 21 07:51:19.670575 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.415095 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2f02deaf3d3626f43477aefec04d329f-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-130-176.ec2.internal\" (UID: \"2f02deaf3d3626f43477aefec04d329f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-176.ec2.internal" Apr 21 07:51:19.670575 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.415508 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2f02deaf3d3626f43477aefec04d329f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-176.ec2.internal\" (UID: \"2f02deaf3d3626f43477aefec04d329f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-176.ec2.internal" Apr 21 07:51:19.670575 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.416124 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ff2da2957d53f026a481f44e2475b521-config\") pod \"kube-apiserver-proxy-ip-10-0-130-176.ec2.internal\" (UID: \"ff2da2957d53f026a481f44e2475b521\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-176.ec2.internal" Apr 21 07:51:19.670575 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:19.515939 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-176.ec2.internal\" not found" Apr 21 07:51:19.670575 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.517093 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2f02deaf3d3626f43477aefec04d329f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-176.ec2.internal\" (UID: \"2f02deaf3d3626f43477aefec04d329f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-176.ec2.internal" Apr 21 07:51:19.670575 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.517130 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/2f02deaf3d3626f43477aefec04d329f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-176.ec2.internal\" (UID: \"2f02deaf3d3626f43477aefec04d329f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-176.ec2.internal" Apr 21 07:51:19.670575 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.517154 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ff2da2957d53f026a481f44e2475b521-config\") pod \"kube-apiserver-proxy-ip-10-0-130-176.ec2.internal\" (UID: \"ff2da2957d53f026a481f44e2475b521\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-176.ec2.internal" Apr 21 07:51:19.670575 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.517176 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2f02deaf3d3626f43477aefec04d329f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-176.ec2.internal\" (UID: \"2f02deaf3d3626f43477aefec04d329f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-176.ec2.internal" Apr 21 07:51:19.671424 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.517184 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ff2da2957d53f026a481f44e2475b521-config\") pod \"kube-apiserver-proxy-ip-10-0-130-176.ec2.internal\" (UID: \"ff2da2957d53f026a481f44e2475b521\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-176.ec2.internal" Apr 21 07:51:19.671424 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.517187 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2f02deaf3d3626f43477aefec04d329f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-176.ec2.internal\" (UID: \"2f02deaf3d3626f43477aefec04d329f\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-176.ec2.internal" Apr 21 07:51:19.671424 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:19.616687 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-176.ec2.internal\" not found" Apr 21 07:51:19.678973 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.678948 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-176.ec2.internal" Apr 21 07:51:19.683244 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.683225 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-176.ec2.internal" Apr 21 07:51:19.717006 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:19.716939 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-176.ec2.internal\" not found" Apr 21 07:51:19.817490 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:19.817454 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-176.ec2.internal\" not found" Apr 21 07:51:19.917905 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:19.917872 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-176.ec2.internal\" not found" Apr 21 07:51:19.998229 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.998150 2567 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 21 07:51:19.998347 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.998300 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 
07:51:19.998402 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:19.998333 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 07:51:20.018456 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:20.018420 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-176.ec2.internal\" not found" Apr 21 07:51:20.108140 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:20.108100 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 07:46:19 +0000 UTC" deadline="2028-01-14 01:42:10.88125042 +0000 UTC" Apr 21 07:51:20.108140 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:20.108135 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15185h50m50.773118601s" Apr 21 07:51:20.111875 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:20.111850 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 21 07:51:20.118611 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:20.118590 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-176.ec2.internal\" not found" Apr 21 07:51:20.128025 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:20.128003 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 07:51:20.143929 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:20.143907 2567 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 07:51:20.148517 ip-10-0-130-176 
kubenswrapper[2567]: I0421 07:51:20.148501 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-gcbx7" Apr 21 07:51:20.156098 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:20.156077 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-gcbx7" Apr 21 07:51:20.212600 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:20.212570 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-176.ec2.internal" Apr 21 07:51:20.226980 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:20.226957 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 07:51:20.228506 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:20.228492 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-176.ec2.internal" Apr 21 07:51:20.236233 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:20.236212 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 07:51:20.241825 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:20.241797 2567 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 07:51:20.325143 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:20.325123 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 07:51:21.009121 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.009087 2567 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 07:51:21.082946 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.082911 2567 
apiserver.go:52] "Watching apiserver" Apr 21 07:51:21.092788 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.092757 2567 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 21 07:51:21.093309 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.093280 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-z2d8g","openshift-ovn-kubernetes/ovnkube-node-2phll","kube-system/konnectivity-agent-8fs97","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z86cz","openshift-cluster-node-tuning-operator/tuned-jjpf8","openshift-image-registry/node-ca-pn27z","openshift-multus/multus-d2hxj","openshift-multus/network-metrics-daemon-mbkk9","openshift-network-diagnostics/network-check-target-9d664","openshift-network-operator/iptables-alerter-vrkff","kube-system/kube-apiserver-proxy-ip-10-0-130-176.ec2.internal","openshift-dns/node-resolver-f8v2g","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-176.ec2.internal"] Apr 21 07:51:21.095797 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.095749 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-d2hxj" Apr 21 07:51:21.098195 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.098143 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2phll" Apr 21 07:51:21.098741 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.098721 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 21 07:51:21.098916 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.098892 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 21 07:51:21.098987 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.098950 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 21 07:51:21.099066 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.099050 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 21 07:51:21.099311 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.099293 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-qfw2d\"" Apr 21 07:51:21.101195 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.101013 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-8fs97" Apr 21 07:51:21.101195 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.101155 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 21 07:51:21.101390 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.101372 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 21 07:51:21.101480 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.101472 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 21 07:51:21.101558 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.101535 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 21 07:51:21.102338 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.102011 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 21 07:51:21.102338 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.102048 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-6vth2\"" Apr 21 07:51:21.102338 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.102172 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 21 07:51:21.103626 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.103603 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-qmsx6\"" Apr 21 07:51:21.104058 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.104040 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 
21 07:51:21.104263 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.104247 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 21 07:51:21.106063 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.106041 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jjpf8" Apr 21 07:51:21.108022 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.107996 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 21 07:51:21.108383 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.108365 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 21 07:51:21.108871 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.108398 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-pn27z" Apr 21 07:51:21.108871 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.108585 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-lpxf4\"" Apr 21 07:51:21.110576 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.110558 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 21 07:51:21.112906 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.112884 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-z2d8g" Apr 21 07:51:21.115028 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.115006 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 21 07:51:21.115405 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.115384 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 21 07:51:21.115489 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.115446 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-jxjsm\"" Apr 21 07:51:21.115610 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.115386 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 21 07:51:21.117985 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.117882 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-hv82z\"" Apr 21 07:51:21.118211 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.118139 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 21 07:51:21.120136 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.119874 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z86cz" Apr 21 07:51:21.122207 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.122094 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 21 07:51:21.122391 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.122371 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 21 07:51:21.123897 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.122573 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mbkk9" Apr 21 07:51:21.123897 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.122589 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-w28fq\"" Apr 21 07:51:21.123897 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:21.122671 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mbkk9" podUID="292daedb-8f6d-4fbe-b50d-eff99dbdb227" Apr 21 07:51:21.123897 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.122799 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 21 07:51:21.123897 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.123297 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7eac19c8-be6a-49df-a01f-690587797f2d-system-cni-dir\") pod \"multus-additional-cni-plugins-z2d8g\" (UID: \"7eac19c8-be6a-49df-a01f-690587797f2d\") " pod="openshift-multus/multus-additional-cni-plugins-z2d8g" Apr 21 07:51:21.123897 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.123328 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7eac19c8-be6a-49df-a01f-690587797f2d-cni-binary-copy\") pod \"multus-additional-cni-plugins-z2d8g\" (UID: \"7eac19c8-be6a-49df-a01f-690587797f2d\") " pod="openshift-multus/multus-additional-cni-plugins-z2d8g" Apr 21 07:51:21.123897 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.123355 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-hostroot\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj" Apr 21 07:51:21.123897 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.123379 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/14fce2a3-229e-4214-926c-0d2eb411facc-konnectivity-ca\") pod \"konnectivity-agent-8fs97\" (UID: 
\"14fce2a3-229e-4214-926c-0d2eb411facc\") " pod="kube-system/konnectivity-agent-8fs97" Apr 21 07:51:21.123897 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.123403 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/77cd248d-7f69-4be8-a1e1-3df94ad81274-sys\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8" Apr 21 07:51:21.123897 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.123425 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/77cd248d-7f69-4be8-a1e1-3df94ad81274-tmp\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8" Apr 21 07:51:21.123897 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.123493 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-system-cni-dir\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj" Apr 21 07:51:21.123897 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.123517 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-host-run-k8s-cni-cncf-io\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj" Apr 21 07:51:21.123897 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.123541 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-host-run-ovn-kubernetes\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll" Apr 21 07:51:21.123897 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.123563 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-host-cni-netd\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll" Apr 21 07:51:21.123897 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.123585 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-host-kubelet\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll" Apr 21 07:51:21.123897 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.123608 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-etc-openvswitch\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll" Apr 21 07:51:21.123897 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.123630 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/77cd248d-7f69-4be8-a1e1-3df94ad81274-var-lib-kubelet\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8" Apr 21 07:51:21.123897 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.123669 2567 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/79b97bca-1c70-43d9-b07b-3b0ac8671a20-device-dir\") pod \"aws-ebs-csi-driver-node-z86cz\" (UID: \"79b97bca-1c70-43d9-b07b-3b0ac8671a20\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z86cz" Apr 21 07:51:21.124902 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.123692 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/79b97bca-1c70-43d9-b07b-3b0ac8671a20-etc-selinux\") pod \"aws-ebs-csi-driver-node-z86cz\" (UID: \"79b97bca-1c70-43d9-b07b-3b0ac8671a20\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z86cz" Apr 21 07:51:21.124902 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.123716 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prn97\" (UniqueName: \"kubernetes.io/projected/79b97bca-1c70-43d9-b07b-3b0ac8671a20-kube-api-access-prn97\") pod \"aws-ebs-csi-driver-node-z86cz\" (UID: \"79b97bca-1c70-43d9-b07b-3b0ac8671a20\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z86cz" Apr 21 07:51:21.124902 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.123741 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/77cd248d-7f69-4be8-a1e1-3df94ad81274-etc-modprobe-d\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8" Apr 21 07:51:21.124902 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.123766 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/77cd248d-7f69-4be8-a1e1-3df94ad81274-etc-kubernetes\") pod \"tuned-jjpf8\" 
(UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8" Apr 21 07:51:21.124902 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.123789 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7eac19c8-be6a-49df-a01f-690587797f2d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-z2d8g\" (UID: \"7eac19c8-be6a-49df-a01f-690587797f2d\") " pod="openshift-multus/multus-additional-cni-plugins-z2d8g" Apr 21 07:51:21.124902 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.123813 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-os-release\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj" Apr 21 07:51:21.124902 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.124418 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-run-systemd\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll" Apr 21 07:51:21.124902 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.124446 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/79b97bca-1c70-43d9-b07b-3b0ac8671a20-kubelet-dir\") pod \"aws-ebs-csi-driver-node-z86cz\" (UID: \"79b97bca-1c70-43d9-b07b-3b0ac8671a20\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z86cz" Apr 21 07:51:21.124902 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.124476 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-multus-conf-dir\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj" Apr 21 07:51:21.124902 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.124499 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-multus-socket-dir-parent\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj" Apr 21 07:51:21.124902 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.124524 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-host-run-netns\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj" Apr 21 07:51:21.124902 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.124557 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-host-run-multus-certs\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj" Apr 21 07:51:21.124902 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.124624 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/79b97bca-1c70-43d9-b07b-3b0ac8671a20-socket-dir\") pod \"aws-ebs-csi-driver-node-z86cz\" (UID: \"79b97bca-1c70-43d9-b07b-3b0ac8671a20\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z86cz" Apr 21 07:51:21.124902 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.124675 2567 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/77cd248d-7f69-4be8-a1e1-3df94ad81274-etc-systemd\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8" Apr 21 07:51:21.124902 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.124702 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7eac19c8-be6a-49df-a01f-690587797f2d-os-release\") pod \"multus-additional-cni-plugins-z2d8g\" (UID: \"7eac19c8-be6a-49df-a01f-690587797f2d\") " pod="openshift-multus/multus-additional-cni-plugins-z2d8g" Apr 21 07:51:21.124902 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.124729 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-systemd-units\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll" Apr 21 07:51:21.125628 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.124752 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-host-run-netns\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll" Apr 21 07:51:21.125628 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.124777 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/14fce2a3-229e-4214-926c-0d2eb411facc-agent-certs\") pod \"konnectivity-agent-8fs97\" (UID: \"14fce2a3-229e-4214-926c-0d2eb411facc\") " 
pod="kube-system/konnectivity-agent-8fs97" Apr 21 07:51:21.125628 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.124800 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/79b97bca-1c70-43d9-b07b-3b0ac8671a20-sys-fs\") pod \"aws-ebs-csi-driver-node-z86cz\" (UID: \"79b97bca-1c70-43d9-b07b-3b0ac8671a20\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z86cz" Apr 21 07:51:21.125628 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.124822 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/77cd248d-7f69-4be8-a1e1-3df94ad81274-etc-sysconfig\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8" Apr 21 07:51:21.125628 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.124844 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbhj4\" (UniqueName: \"kubernetes.io/projected/77cd248d-7f69-4be8-a1e1-3df94ad81274-kube-api-access-qbhj4\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8" Apr 21 07:51:21.125628 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.124881 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-host-var-lib-cni-bin\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj" Apr 21 07:51:21.125628 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.124910 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-host-slash\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll" Apr 21 07:51:21.125628 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.124934 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/77cd248d-7f69-4be8-a1e1-3df94ad81274-run\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8" Apr 21 07:51:21.125628 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.124961 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/77cd248d-7f69-4be8-a1e1-3df94ad81274-etc-tuned\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8" Apr 21 07:51:21.125628 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.124984 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w2q9\" (UniqueName: \"kubernetes.io/projected/910435a2-053a-4a3e-9020-156057e0c177-kube-api-access-2w2q9\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj" Apr 21 07:51:21.125628 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.125012 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-run-openvswitch\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll" Apr 21 07:51:21.125628 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.125036 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-log-socket\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll"
Apr 21 07:51:21.125628 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.125058 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/292daedb-8f6d-4fbe-b50d-eff99dbdb227-metrics-certs\") pod \"network-metrics-daemon-mbkk9\" (UID: \"292daedb-8f6d-4fbe-b50d-eff99dbdb227\") " pod="openshift-multus/network-metrics-daemon-mbkk9"
Apr 21 07:51:21.125628 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.125092 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/79b97bca-1c70-43d9-b07b-3b0ac8671a20-registration-dir\") pod \"aws-ebs-csi-driver-node-z86cz\" (UID: \"79b97bca-1c70-43d9-b07b-3b0ac8671a20\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z86cz"
Apr 21 07:51:21.125628 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.125119 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7eac19c8-be6a-49df-a01f-690587797f2d-cnibin\") pod \"multus-additional-cni-plugins-z2d8g\" (UID: \"7eac19c8-be6a-49df-a01f-690587797f2d\") " pod="openshift-multus/multus-additional-cni-plugins-z2d8g"
Apr 21 07:51:21.125628 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.125147 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a8a61d55-c981-4fda-bb59-0fc4d138d739-ovn-node-metrics-cert\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll"
Apr 21 07:51:21.126386 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.125178 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-multus-cni-dir\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj"
Apr 21 07:51:21.126386 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.125212 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-cnibin\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj"
Apr 21 07:51:21.126386 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.125237 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/910435a2-053a-4a3e-9020-156057e0c177-cni-binary-copy\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj"
Apr 21 07:51:21.126386 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.125277 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/910435a2-053a-4a3e-9020-156057e0c177-multus-daemon-config\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj"
Apr 21 07:51:21.126386 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.125312 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-run-ovn\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll"
Apr 21 07:51:21.126386 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.125335 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-node-log\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll"
Apr 21 07:51:21.126386 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.125390 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7eac19c8-be6a-49df-a01f-690587797f2d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-z2d8g\" (UID: \"7eac19c8-be6a-49df-a01f-690587797f2d\") " pod="openshift-multus/multus-additional-cni-plugins-z2d8g"
Apr 21 07:51:21.126386 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.125421 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a8a61d55-c981-4fda-bb59-0fc4d138d739-ovnkube-config\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll"
Apr 21 07:51:21.126386 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.125443 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-826zm\" (UniqueName: \"kubernetes.io/projected/292daedb-8f6d-4fbe-b50d-eff99dbdb227-kube-api-access-826zm\") pod \"network-metrics-daemon-mbkk9\" (UID: \"292daedb-8f6d-4fbe-b50d-eff99dbdb227\") " pod="openshift-multus/network-metrics-daemon-mbkk9"
Apr 21 07:51:21.126386 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.125496 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/77cd248d-7f69-4be8-a1e1-3df94ad81274-etc-sysctl-d\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8"
Apr 21 07:51:21.126386 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.125521 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-host-var-lib-kubelet\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj"
Apr 21 07:51:21.126386 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.125546 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-host-var-lib-cni-multus\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj"
Apr 21 07:51:21.126386 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.125572 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc595\" (UniqueName: \"kubernetes.io/projected/a8a61d55-c981-4fda-bb59-0fc4d138d739-kube-api-access-xc595\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll"
Apr 21 07:51:21.126386 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.125596 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/77cd248d-7f69-4be8-a1e1-3df94ad81274-etc-sysctl-conf\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8"
Apr 21 07:51:21.126386 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.125619 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/77cd248d-7f69-4be8-a1e1-3df94ad81274-lib-modules\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8"
Apr 21 07:51:21.126386 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.125645 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fbed9d63-ec12-483e-ba8d-a4082bbfd141-host\") pod \"node-ca-pn27z\" (UID: \"fbed9d63-ec12-483e-ba8d-a4082bbfd141\") " pod="openshift-image-registry/node-ca-pn27z"
Apr 21 07:51:21.126386 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.125687 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/77cd248d-7f69-4be8-a1e1-3df94ad81274-host\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8"
Apr 21 07:51:21.127241 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.125712 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fbed9d63-ec12-483e-ba8d-a4082bbfd141-serviceca\") pod \"node-ca-pn27z\" (UID: \"fbed9d63-ec12-483e-ba8d-a4082bbfd141\") " pod="openshift-image-registry/node-ca-pn27z"
Apr 21 07:51:21.127241 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.125735 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz2x6\" (UniqueName: \"kubernetes.io/projected/7eac19c8-be6a-49df-a01f-690587797f2d-kube-api-access-cz2x6\") pod \"multus-additional-cni-plugins-z2d8g\" (UID: \"7eac19c8-be6a-49df-a01f-690587797f2d\") " pod="openshift-multus/multus-additional-cni-plugins-z2d8g"
Apr 21 07:51:21.127241 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.125760 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-var-lib-openvswitch\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll"
Apr 21 07:51:21.127241 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.125803 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-etc-kubernetes\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj"
Apr 21 07:51:21.127241 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.125839 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-host-cni-bin\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll"
Apr 21 07:51:21.127241 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.125869 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a8a61d55-c981-4fda-bb59-0fc4d138d739-env-overrides\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll"
Apr 21 07:51:21.127241 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.125895 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a8a61d55-c981-4fda-bb59-0fc4d138d739-ovnkube-script-lib\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll"
Apr 21 07:51:21.127241 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.125919 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9fvf\" (UniqueName: \"kubernetes.io/projected/fbed9d63-ec12-483e-ba8d-a4082bbfd141-kube-api-access-l9fvf\") pod \"node-ca-pn27z\" (UID: \"fbed9d63-ec12-483e-ba8d-a4082bbfd141\") " pod="openshift-image-registry/node-ca-pn27z"
Apr 21 07:51:21.127241 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.125944 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7eac19c8-be6a-49df-a01f-690587797f2d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-z2d8g\" (UID: \"7eac19c8-be6a-49df-a01f-690587797f2d\") " pod="openshift-multus/multus-additional-cni-plugins-z2d8g"
Apr 21 07:51:21.127241 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.125970 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll"
Apr 21 07:51:21.127241 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.126892 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9d664"
Apr 21 07:51:21.127241 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:21.126962 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9d664" podUID="b183c600-7bbc-4275-b1b6-1a71e7b6cc15"
Apr 21 07:51:21.129300 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.129170 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-f8v2g"
Apr 21 07:51:21.129405 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.129375 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-vrkff"
Apr 21 07:51:21.131629 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.131602 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 21 07:51:21.132259 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.132131 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 21 07:51:21.132397 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.132380 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-vdrb4\""
Apr 21 07:51:21.132472 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.132429 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 21 07:51:21.132566 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.132551 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 21 07:51:21.132701 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.132686 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 21 07:51:21.134348 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.132837 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8q9sp\""
Apr 21 07:51:21.156939 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.156911 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 07:46:20 +0000 UTC" deadline="2027-09-17 04:32:54.425479474 +0000 UTC"
Apr 21 07:51:21.156939 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.156937 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12332h41m33.268545174s"
Apr 21 07:51:21.213504 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.213439 2567 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 21 07:51:21.226425 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.226394 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l9fvf\" (UniqueName: \"kubernetes.io/projected/fbed9d63-ec12-483e-ba8d-a4082bbfd141-kube-api-access-l9fvf\") pod \"node-ca-pn27z\" (UID: \"fbed9d63-ec12-483e-ba8d-a4082bbfd141\") " pod="openshift-image-registry/node-ca-pn27z"
Apr 21 07:51:21.226616 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.226448 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7eac19c8-be6a-49df-a01f-690587797f2d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-z2d8g\" (UID: \"7eac19c8-be6a-49df-a01f-690587797f2d\") " pod="openshift-multus/multus-additional-cni-plugins-z2d8g"
Apr 21 07:51:21.226616 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.226499 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ps2z\" (UniqueName: \"kubernetes.io/projected/b183c600-7bbc-4275-b1b6-1a71e7b6cc15-kube-api-access-5ps2z\") pod \"network-check-target-9d664\" (UID: \"b183c600-7bbc-4275-b1b6-1a71e7b6cc15\") " pod="openshift-network-diagnostics/network-check-target-9d664"
Apr 21 07:51:21.226616 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.226527 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll"
Apr 21 07:51:21.226616 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.226552 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7eac19c8-be6a-49df-a01f-690587797f2d-system-cni-dir\") pod \"multus-additional-cni-plugins-z2d8g\" (UID: \"7eac19c8-be6a-49df-a01f-690587797f2d\") " pod="openshift-multus/multus-additional-cni-plugins-z2d8g"
Apr 21 07:51:21.226616 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.226577 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7eac19c8-be6a-49df-a01f-690587797f2d-cni-binary-copy\") pod \"multus-additional-cni-plugins-z2d8g\" (UID: \"7eac19c8-be6a-49df-a01f-690587797f2d\") " pod="openshift-multus/multus-additional-cni-plugins-z2d8g"
Apr 21 07:51:21.226616 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.226603 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ee4e1b33-295c-4557-9f5f-2cc029155627-iptables-alerter-script\") pod \"iptables-alerter-vrkff\" (UID: \"ee4e1b33-295c-4557-9f5f-2cc029155627\") " pod="openshift-network-operator/iptables-alerter-vrkff"
Apr 21 07:51:21.226926 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.226629 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j66m\" (UniqueName: \"kubernetes.io/projected/ee4e1b33-295c-4557-9f5f-2cc029155627-kube-api-access-7j66m\") pod \"iptables-alerter-vrkff\" (UID: \"ee4e1b33-295c-4557-9f5f-2cc029155627\") " pod="openshift-network-operator/iptables-alerter-vrkff"
Apr 21 07:51:21.226926 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.226669 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-hostroot\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj"
Apr 21 07:51:21.226926 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.226695 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/14fce2a3-229e-4214-926c-0d2eb411facc-konnectivity-ca\") pod \"konnectivity-agent-8fs97\" (UID: \"14fce2a3-229e-4214-926c-0d2eb411facc\") " pod="kube-system/konnectivity-agent-8fs97"
Apr 21 07:51:21.226926 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.226726 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/77cd248d-7f69-4be8-a1e1-3df94ad81274-sys\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8"
Apr 21 07:51:21.226926 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.226748 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/77cd248d-7f69-4be8-a1e1-3df94ad81274-tmp\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8"
Apr 21 07:51:21.226926 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.226770 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-system-cni-dir\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj"
Apr 21 07:51:21.226926 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.226808 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-host-run-k8s-cni-cncf-io\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj"
Apr 21 07:51:21.226926 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.226857 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-host-run-ovn-kubernetes\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll"
Apr 21 07:51:21.226926 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.226899 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-host-cni-netd\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll"
Apr 21 07:51:21.226926 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.226922 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-host-kubelet\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll"
Apr 21 07:51:21.227338 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.226929 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/77cd248d-7f69-4be8-a1e1-3df94ad81274-sys\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8"
Apr 21 07:51:21.227338 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.226943 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-etc-openvswitch\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll"
Apr 21 07:51:21.227338 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.226992 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-host-run-k8s-cni-cncf-io\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj"
Apr 21 07:51:21.227338 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.227077 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-host-kubelet\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll"
Apr 21 07:51:21.227588 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.227038 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/77cd248d-7f69-4be8-a1e1-3df94ad81274-var-lib-kubelet\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8"
Apr 21 07:51:21.227700 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.227632 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ee4e1b33-295c-4557-9f5f-2cc029155627-host-slash\") pod \"iptables-alerter-vrkff\" (UID: \"ee4e1b33-295c-4557-9f5f-2cc029155627\") " pod="openshift-network-operator/iptables-alerter-vrkff"
Apr 21 07:51:21.227737 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.227697 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/79b97bca-1c70-43d9-b07b-3b0ac8671a20-device-dir\") pod \"aws-ebs-csi-driver-node-z86cz\" (UID: \"79b97bca-1c70-43d9-b07b-3b0ac8671a20\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z86cz"
Apr 21 07:51:21.227737 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.227729 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/79b97bca-1c70-43d9-b07b-3b0ac8671a20-etc-selinux\") pod \"aws-ebs-csi-driver-node-z86cz\" (UID: \"79b97bca-1c70-43d9-b07b-3b0ac8671a20\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z86cz"
Apr 21 07:51:21.227856 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.227763 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prn97\" (UniqueName: \"kubernetes.io/projected/79b97bca-1c70-43d9-b07b-3b0ac8671a20-kube-api-access-prn97\") pod \"aws-ebs-csi-driver-node-z86cz\" (UID: \"79b97bca-1c70-43d9-b07b-3b0ac8671a20\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z86cz"
Apr 21 07:51:21.227856 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.227798 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/77cd248d-7f69-4be8-a1e1-3df94ad81274-etc-modprobe-d\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8"
Apr 21 07:51:21.227856 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.227832 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/77cd248d-7f69-4be8-a1e1-3df94ad81274-etc-kubernetes\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8"
Apr 21 07:51:21.227997 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.227867 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7eac19c8-be6a-49df-a01f-690587797f2d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-z2d8g\" (UID: \"7eac19c8-be6a-49df-a01f-690587797f2d\") " pod="openshift-multus/multus-additional-cni-plugins-z2d8g"
Apr 21 07:51:21.227997 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.227903 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-os-release\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj"
Apr 21 07:51:21.227997 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.227522 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll"
Apr 21 07:51:21.227997 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.227537 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-system-cni-dir\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj"
Apr 21 07:51:21.228188 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.227994 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7eac19c8-be6a-49df-a01f-690587797f2d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-z2d8g\" (UID: \"7eac19c8-be6a-49df-a01f-690587797f2d\") " pod="openshift-multus/multus-additional-cni-plugins-z2d8g"
Apr 21 07:51:21.228188 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.228057 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-run-systemd\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll"
Apr 21 07:51:21.228188 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.228097 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/79b97bca-1c70-43d9-b07b-3b0ac8671a20-kubelet-dir\") pod \"aws-ebs-csi-driver-node-z86cz\" (UID: \"79b97bca-1c70-43d9-b07b-3b0ac8671a20\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z86cz"
Apr 21 07:51:21.228188 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.228132 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-multus-conf-dir\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj"
Apr 21 07:51:21.228188 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.228168 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-multus-socket-dir-parent\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj"
Apr 21 07:51:21.228424 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.228201 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-host-run-netns\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj"
Apr 21 07:51:21.228424 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.228228 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-host-run-multus-certs\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj"
Apr 21 07:51:21.228424 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.228261 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/79b97bca-1c70-43d9-b07b-3b0ac8671a20-socket-dir\") pod \"aws-ebs-csi-driver-node-z86cz\" (UID: \"79b97bca-1c70-43d9-b07b-3b0ac8671a20\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z86cz"
Apr 21 07:51:21.228424 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.228288 2567 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 21 07:51:21.228424 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.228351 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/77cd248d-7f69-4be8-a1e1-3df94ad81274-etc-systemd\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8"
Apr 21 07:51:21.228424 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.228406 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-os-release\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj"
Apr 21 07:51:21.228715 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.228451 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/79b97bca-1c70-43d9-b07b-3b0ac8671a20-device-dir\") pod \"aws-ebs-csi-driver-node-z86cz\" (UID: \"79b97bca-1c70-43d9-b07b-3b0ac8671a20\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z86cz"
Apr 21 07:51:21.228715 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.228566 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/79b97bca-1c70-43d9-b07b-3b0ac8671a20-etc-selinux\") pod \"aws-ebs-csi-driver-node-z86cz\" (UID: \"79b97bca-1c70-43d9-b07b-3b0ac8671a20\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z86cz"
Apr 21 07:51:21.228892 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.228871 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-hostroot\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj"
Apr 21 07:51:21.231798 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.229067 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-host-run-multus-certs\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj"
Apr 21 07:51:21.231798 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.229081 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/77cd248d-7f69-4be8-a1e1-3df94ad81274-etc-modprobe-d\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8"
Apr 21 07:51:21.231798 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.229127 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-run-systemd\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll"
Apr 21 07:51:21.231798 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.229173 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/79b97bca-1c70-43d9-b07b-3b0ac8671a20-kubelet-dir\") pod \"aws-ebs-csi-driver-node-z86cz\" (UID: \"79b97bca-1c70-43d9-b07b-3b0ac8671a20\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z86cz"
Apr 21 07:51:21.231798 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.229192 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/79b97bca-1c70-43d9-b07b-3b0ac8671a20-socket-dir\") pod \"aws-ebs-csi-driver-node-z86cz\" (UID: \"79b97bca-1c70-43d9-b07b-3b0ac8671a20\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z86cz"
Apr 21 07:51:21.231798 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.229214 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-multus-conf-dir\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj"
Apr 21 07:51:21.231798 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.229267 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/77cd248d-7f69-4be8-a1e1-3df94ad81274-var-lib-kubelet\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8"
Apr 21 07:51:21.231798 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.229290 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/77cd248d-7f69-4be8-a1e1-3df94ad81274-etc-kubernetes\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8"
Apr 21 07:51:21.231798 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.229357 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7eac19c8-be6a-49df-a01f-690587797f2d-system-cni-dir\") pod \"multus-additional-cni-plugins-z2d8g\" (UID: \"7eac19c8-be6a-49df-a01f-690587797f2d\") " pod="openshift-multus/multus-additional-cni-plugins-z2d8g"
Apr 21 07:51:21.231798 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.229459 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-etc-openvswitch\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll"
Apr 21 07:51:21.231798 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.229519 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-host-run-netns\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj"
Apr 21 07:51:21.231798 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.229667 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-host-run-ovn-kubernetes\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll"
Apr 21 07:51:21.231798 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.229814 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7eac19c8-be6a-49df-a01f-690587797f2d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-z2d8g\" (UID: \"7eac19c8-be6a-49df-a01f-690587797f2d\") " pod="openshift-multus/multus-additional-cni-plugins-z2d8g"
Apr 21 07:51:21.231798 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.228294 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/77cd248d-7f69-4be8-a1e1-3df94ad81274-etc-systemd\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8"
Apr 21 07:51:21.231798 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.229892 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-host-cni-netd\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll" Apr 21 07:51:21.231798 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.229939 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7eac19c8-be6a-49df-a01f-690587797f2d-os-release\") pod \"multus-additional-cni-plugins-z2d8g\" (UID: \"7eac19c8-be6a-49df-a01f-690587797f2d\") " pod="openshift-multus/multus-additional-cni-plugins-z2d8g" Apr 21 07:51:21.231798 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.229951 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-multus-socket-dir-parent\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj" Apr 21 07:51:21.232699 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.229994 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4d8fb59f-57b4-44fc-bfd7-f9714c35d8be-tmp-dir\") pod \"node-resolver-f8v2g\" (UID: \"4d8fb59f-57b4-44fc-bfd7-f9714c35d8be\") " pod="openshift-dns/node-resolver-f8v2g" Apr 21 07:51:21.232699 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.230037 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-systemd-units\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll" Apr 21 07:51:21.232699 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.230083 2567 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-host-run-netns\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll" Apr 21 07:51:21.232699 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.230118 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/14fce2a3-229e-4214-926c-0d2eb411facc-agent-certs\") pod \"konnectivity-agent-8fs97\" (UID: \"14fce2a3-229e-4214-926c-0d2eb411facc\") " pod="kube-system/konnectivity-agent-8fs97" Apr 21 07:51:21.232699 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.230172 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/79b97bca-1c70-43d9-b07b-3b0ac8671a20-sys-fs\") pod \"aws-ebs-csi-driver-node-z86cz\" (UID: \"79b97bca-1c70-43d9-b07b-3b0ac8671a20\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z86cz" Apr 21 07:51:21.232699 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.230204 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/77cd248d-7f69-4be8-a1e1-3df94ad81274-etc-sysconfig\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8" Apr 21 07:51:21.232699 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.230233 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qbhj4\" (UniqueName: \"kubernetes.io/projected/77cd248d-7f69-4be8-a1e1-3df94ad81274-kube-api-access-qbhj4\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8" Apr 21 07:51:21.232699 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.230287 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-host-var-lib-cni-bin\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj" Apr 21 07:51:21.232699 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.230346 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-host-slash\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll" Apr 21 07:51:21.232699 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.230378 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/77cd248d-7f69-4be8-a1e1-3df94ad81274-run\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8" Apr 21 07:51:21.232699 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.230444 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/77cd248d-7f69-4be8-a1e1-3df94ad81274-etc-tuned\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8" Apr 21 07:51:21.232699 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.230427 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/14fce2a3-229e-4214-926c-0d2eb411facc-konnectivity-ca\") pod \"konnectivity-agent-8fs97\" (UID: \"14fce2a3-229e-4214-926c-0d2eb411facc\") " pod="kube-system/konnectivity-agent-8fs97" Apr 21 07:51:21.232699 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.230476 2567 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kube-api-access-2w2q9\" (UniqueName: \"kubernetes.io/projected/910435a2-053a-4a3e-9020-156057e0c177-kube-api-access-2w2q9\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj" Apr 21 07:51:21.232699 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.230575 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-host-slash\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll" Apr 21 07:51:21.232699 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.230668 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/77cd248d-7f69-4be8-a1e1-3df94ad81274-run\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8" Apr 21 07:51:21.232699 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.230717 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/79b97bca-1c70-43d9-b07b-3b0ac8671a20-sys-fs\") pod \"aws-ebs-csi-driver-node-z86cz\" (UID: \"79b97bca-1c70-43d9-b07b-3b0ac8671a20\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z86cz" Apr 21 07:51:21.232699 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.231107 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-run-openvswitch\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll" Apr 21 07:51:21.232699 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.231177 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-log-socket\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll" Apr 21 07:51:21.233495 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.231270 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/292daedb-8f6d-4fbe-b50d-eff99dbdb227-metrics-certs\") pod \"network-metrics-daemon-mbkk9\" (UID: \"292daedb-8f6d-4fbe-b50d-eff99dbdb227\") " pod="openshift-multus/network-metrics-daemon-mbkk9" Apr 21 07:51:21.233495 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.231303 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/79b97bca-1c70-43d9-b07b-3b0ac8671a20-registration-dir\") pod \"aws-ebs-csi-driver-node-z86cz\" (UID: \"79b97bca-1c70-43d9-b07b-3b0ac8671a20\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z86cz" Apr 21 07:51:21.233495 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.231330 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/77cd248d-7f69-4be8-a1e1-3df94ad81274-etc-sysconfig\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8" Apr 21 07:51:21.233495 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.231340 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7eac19c8-be6a-49df-a01f-690587797f2d-cnibin\") pod \"multus-additional-cni-plugins-z2d8g\" (UID: \"7eac19c8-be6a-49df-a01f-690587797f2d\") " pod="openshift-multus/multus-additional-cni-plugins-z2d8g" Apr 21 07:51:21.233495 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.231377 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a8a61d55-c981-4fda-bb59-0fc4d138d739-ovn-node-metrics-cert\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll" Apr 21 07:51:21.233495 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.231397 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-host-var-lib-cni-bin\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj" Apr 21 07:51:21.233495 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.231413 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-multus-cni-dir\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj" Apr 21 07:51:21.233495 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.231510 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-run-openvswitch\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll" Apr 21 07:51:21.233495 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.231516 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-cnibin\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj" Apr 21 07:51:21.233495 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.231527 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-multus-cni-dir\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj" Apr 21 07:51:21.233495 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.231553 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/910435a2-053a-4a3e-9020-156057e0c177-cni-binary-copy\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj" Apr 21 07:51:21.233495 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.231559 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-host-run-netns\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll" Apr 21 07:51:21.233495 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.231576 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-systemd-units\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll" Apr 21 07:51:21.233495 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.231593 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-log-socket\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll" Apr 21 07:51:21.233495 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.231585 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/910435a2-053a-4a3e-9020-156057e0c177-multus-daemon-config\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj" Apr 21 07:51:21.233495 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:21.231763 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:51:21.233495 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.232127 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/79b97bca-1c70-43d9-b07b-3b0ac8671a20-registration-dir\") pod \"aws-ebs-csi-driver-node-z86cz\" (UID: \"79b97bca-1c70-43d9-b07b-3b0ac8671a20\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z86cz" Apr 21 07:51:21.233495 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.232677 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/910435a2-053a-4a3e-9020-156057e0c177-multus-daemon-config\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj" Apr 21 07:51:21.234291 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.233073 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-cnibin\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj" Apr 21 07:51:21.234291 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.233118 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7eac19c8-be6a-49df-a01f-690587797f2d-cnibin\") pod \"multus-additional-cni-plugins-z2d8g\" (UID: \"7eac19c8-be6a-49df-a01f-690587797f2d\") " pod="openshift-multus/multus-additional-cni-plugins-z2d8g" 
Apr 21 07:51:21.234291 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.233317 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-run-ovn\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll" Apr 21 07:51:21.234291 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.233359 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-node-log\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll" Apr 21 07:51:21.234291 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.233392 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7eac19c8-be6a-49df-a01f-690587797f2d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-z2d8g\" (UID: \"7eac19c8-be6a-49df-a01f-690587797f2d\") " pod="openshift-multus/multus-additional-cni-plugins-z2d8g" Apr 21 07:51:21.234291 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.233439 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a8a61d55-c981-4fda-bb59-0fc4d138d739-ovnkube-config\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll" Apr 21 07:51:21.234291 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.233508 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-826zm\" (UniqueName: \"kubernetes.io/projected/292daedb-8f6d-4fbe-b50d-eff99dbdb227-kube-api-access-826zm\") pod \"network-metrics-daemon-mbkk9\" (UID: \"292daedb-8f6d-4fbe-b50d-eff99dbdb227\") " 
pod="openshift-multus/network-metrics-daemon-mbkk9" Apr 21 07:51:21.234291 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.233560 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/77cd248d-7f69-4be8-a1e1-3df94ad81274-etc-sysctl-d\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8" Apr 21 07:51:21.234291 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.233598 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzwrs\" (UniqueName: \"kubernetes.io/projected/4d8fb59f-57b4-44fc-bfd7-f9714c35d8be-kube-api-access-nzwrs\") pod \"node-resolver-f8v2g\" (UID: \"4d8fb59f-57b4-44fc-bfd7-f9714c35d8be\") " pod="openshift-dns/node-resolver-f8v2g" Apr 21 07:51:21.234291 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.233666 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-host-var-lib-kubelet\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj" Apr 21 07:51:21.234291 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.233701 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-host-var-lib-cni-multus\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj" Apr 21 07:51:21.234291 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.233748 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xc595\" (UniqueName: \"kubernetes.io/projected/a8a61d55-c981-4fda-bb59-0fc4d138d739-kube-api-access-xc595\") pod \"ovnkube-node-2phll\" (UID: 
\"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll" Apr 21 07:51:21.234291 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.233789 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/77cd248d-7f69-4be8-a1e1-3df94ad81274-etc-sysctl-conf\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8" Apr 21 07:51:21.234291 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.233943 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/77cd248d-7f69-4be8-a1e1-3df94ad81274-lib-modules\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8" Apr 21 07:51:21.234291 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.234019 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fbed9d63-ec12-483e-ba8d-a4082bbfd141-host\") pod \"node-ca-pn27z\" (UID: \"fbed9d63-ec12-483e-ba8d-a4082bbfd141\") " pod="openshift-image-registry/node-ca-pn27z" Apr 21 07:51:21.237132 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.234311 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/77cd248d-7f69-4be8-a1e1-3df94ad81274-host\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8" Apr 21 07:51:21.237132 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.234357 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fbed9d63-ec12-483e-ba8d-a4082bbfd141-serviceca\") pod \"node-ca-pn27z\" (UID: \"fbed9d63-ec12-483e-ba8d-a4082bbfd141\") " 
pod="openshift-image-registry/node-ca-pn27z" Apr 21 07:51:21.237132 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.234396 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cz2x6\" (UniqueName: \"kubernetes.io/projected/7eac19c8-be6a-49df-a01f-690587797f2d-kube-api-access-cz2x6\") pod \"multus-additional-cni-plugins-z2d8g\" (UID: \"7eac19c8-be6a-49df-a01f-690587797f2d\") " pod="openshift-multus/multus-additional-cni-plugins-z2d8g" Apr 21 07:51:21.237132 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.234449 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4d8fb59f-57b4-44fc-bfd7-f9714c35d8be-hosts-file\") pod \"node-resolver-f8v2g\" (UID: \"4d8fb59f-57b4-44fc-bfd7-f9714c35d8be\") " pod="openshift-dns/node-resolver-f8v2g" Apr 21 07:51:21.237132 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.234497 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-var-lib-openvswitch\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll" Apr 21 07:51:21.237132 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.234533 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-etc-kubernetes\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj" Apr 21 07:51:21.237132 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.234571 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-host-cni-bin\") pod \"ovnkube-node-2phll\" (UID: 
\"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll" Apr 21 07:51:21.237132 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.234605 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a8a61d55-c981-4fda-bb59-0fc4d138d739-env-overrides\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll" Apr 21 07:51:21.237132 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.234622 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-host-var-lib-kubelet\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj" Apr 21 07:51:21.237132 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.234634 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a8a61d55-c981-4fda-bb59-0fc4d138d739-ovnkube-script-lib\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll" Apr 21 07:51:21.237132 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.234686 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fbed9d63-ec12-483e-ba8d-a4082bbfd141-host\") pod \"node-ca-pn27z\" (UID: \"fbed9d63-ec12-483e-ba8d-a4082bbfd141\") " pod="openshift-image-registry/node-ca-pn27z" Apr 21 07:51:21.237132 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.235284 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a8a61d55-c981-4fda-bb59-0fc4d138d739-ovnkube-script-lib\") pod \"ovnkube-node-2phll\" (UID: 
\"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll"
Apr 21 07:51:21.237132 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.235295 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/77cd248d-7f69-4be8-a1e1-3df94ad81274-etc-sysctl-d\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8"
Apr 21 07:51:21.237132 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:21.235528 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/292daedb-8f6d-4fbe-b50d-eff99dbdb227-metrics-certs podName:292daedb-8f6d-4fbe-b50d-eff99dbdb227 nodeName:}" failed. No retries permitted until 2026-04-21 07:51:21.735479096 +0000 UTC m=+3.133726214 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/292daedb-8f6d-4fbe-b50d-eff99dbdb227-metrics-certs") pod "network-metrics-daemon-mbkk9" (UID: "292daedb-8f6d-4fbe-b50d-eff99dbdb227") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:51:21.237132 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.235674 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fbed9d63-ec12-483e-ba8d-a4082bbfd141-serviceca\") pod \"node-ca-pn27z\" (UID: \"fbed9d63-ec12-483e-ba8d-a4082bbfd141\") " pod="openshift-image-registry/node-ca-pn27z"
Apr 21 07:51:21.237132 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.235815 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-host-var-lib-cni-multus\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj"
Apr 21 07:51:21.237132 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.236107 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/77cd248d-7f69-4be8-a1e1-3df94ad81274-host\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8"
Apr 21 07:51:21.237926 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.236182 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/910435a2-053a-4a3e-9020-156057e0c177-cni-binary-copy\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj"
Apr 21 07:51:21.237926 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.236263 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7eac19c8-be6a-49df-a01f-690587797f2d-os-release\") pod \"multus-additional-cni-plugins-z2d8g\" (UID: \"7eac19c8-be6a-49df-a01f-690587797f2d\") " pod="openshift-multus/multus-additional-cni-plugins-z2d8g"
Apr 21 07:51:21.237926 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.236373 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/77cd248d-7f69-4be8-a1e1-3df94ad81274-lib-modules\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8"
Apr 21 07:51:21.237926 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.237459 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7eac19c8-be6a-49df-a01f-690587797f2d-cni-binary-copy\") pod \"multus-additional-cni-plugins-z2d8g\" (UID: \"7eac19c8-be6a-49df-a01f-690587797f2d\") " pod="openshift-multus/multus-additional-cni-plugins-z2d8g"
Apr 21 07:51:21.238581 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.238343 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a8a61d55-c981-4fda-bb59-0fc4d138d739-ovnkube-config\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll"
Apr 21 07:51:21.238780 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.238755 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-var-lib-openvswitch\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll"
Apr 21 07:51:21.238871 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.238855 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/910435a2-053a-4a3e-9020-156057e0c177-etc-kubernetes\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj"
Apr 21 07:51:21.238939 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.238922 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-host-cni-bin\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll"
Apr 21 07:51:21.238996 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.238921 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/77cd248d-7f69-4be8-a1e1-3df94ad81274-etc-sysctl-conf\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8"
Apr 21 07:51:21.239677 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.239474 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/77cd248d-7f69-4be8-a1e1-3df94ad81274-tmp\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8"
Apr 21 07:51:21.240642 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.240356 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-node-log\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll"
Apr 21 07:51:21.240642 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.240422 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a8a61d55-c981-4fda-bb59-0fc4d138d739-run-ovn\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll"
Apr 21 07:51:21.240642 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.240563 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7eac19c8-be6a-49df-a01f-690587797f2d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-z2d8g\" (UID: \"7eac19c8-be6a-49df-a01f-690587797f2d\") " pod="openshift-multus/multus-additional-cni-plugins-z2d8g"
Apr 21 07:51:21.241627 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.241608 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a8a61d55-c981-4fda-bb59-0fc4d138d739-env-overrides\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll"
Apr 21 07:51:21.242760 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.242742 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9fvf\" (UniqueName: \"kubernetes.io/projected/fbed9d63-ec12-483e-ba8d-a4082bbfd141-kube-api-access-l9fvf\") pod \"node-ca-pn27z\" (UID: \"fbed9d63-ec12-483e-ba8d-a4082bbfd141\") " pod="openshift-image-registry/node-ca-pn27z"
Apr 21 07:51:21.243045 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.242939 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/14fce2a3-229e-4214-926c-0d2eb411facc-agent-certs\") pod \"konnectivity-agent-8fs97\" (UID: \"14fce2a3-229e-4214-926c-0d2eb411facc\") " pod="kube-system/konnectivity-agent-8fs97"
Apr 21 07:51:21.243291 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.243123 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/77cd248d-7f69-4be8-a1e1-3df94ad81274-etc-tuned\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8"
Apr 21 07:51:21.243954 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.243427 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w2q9\" (UniqueName: \"kubernetes.io/projected/910435a2-053a-4a3e-9020-156057e0c177-kube-api-access-2w2q9\") pod \"multus-d2hxj\" (UID: \"910435a2-053a-4a3e-9020-156057e0c177\") " pod="openshift-multus/multus-d2hxj"
Apr 21 07:51:21.243954 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.243638 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-prn97\" (UniqueName: \"kubernetes.io/projected/79b97bca-1c70-43d9-b07b-3b0ac8671a20-kube-api-access-prn97\") pod \"aws-ebs-csi-driver-node-z86cz\" (UID: \"79b97bca-1c70-43d9-b07b-3b0ac8671a20\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z86cz"
Apr 21 07:51:21.245874 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.245264 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a8a61d55-c981-4fda-bb59-0fc4d138d739-ovn-node-metrics-cert\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll"
Apr 21 07:51:21.245874 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.245600 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbhj4\" (UniqueName: \"kubernetes.io/projected/77cd248d-7f69-4be8-a1e1-3df94ad81274-kube-api-access-qbhj4\") pod \"tuned-jjpf8\" (UID: \"77cd248d-7f69-4be8-a1e1-3df94ad81274\") " pod="openshift-cluster-node-tuning-operator/tuned-jjpf8"
Apr 21 07:51:21.246938 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.246383 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-176.ec2.internal" event={"ID":"2f02deaf3d3626f43477aefec04d329f","Type":"ContainerStarted","Data":"9329960bebe8da0de04ba2ac215f8c47c7120b5dc61e9907a791057c33391c08"}
Apr 21 07:51:21.246938 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.246640 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc595\" (UniqueName: \"kubernetes.io/projected/a8a61d55-c981-4fda-bb59-0fc4d138d739-kube-api-access-xc595\") pod \"ovnkube-node-2phll\" (UID: \"a8a61d55-c981-4fda-bb59-0fc4d138d739\") " pod="openshift-ovn-kubernetes/ovnkube-node-2phll"
Apr 21 07:51:21.248706 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.248178 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz2x6\" (UniqueName: \"kubernetes.io/projected/7eac19c8-be6a-49df-a01f-690587797f2d-kube-api-access-cz2x6\") pod \"multus-additional-cni-plugins-z2d8g\" (UID: \"7eac19c8-be6a-49df-a01f-690587797f2d\") " pod="openshift-multus/multus-additional-cni-plugins-z2d8g"
Apr 21 07:51:21.248706 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.248235 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-176.ec2.internal" event={"ID":"ff2da2957d53f026a481f44e2475b521","Type":"ContainerStarted","Data":"2857cec65b409a1417a62e3aff2d24805b071f8781ab3b19a0a74388c45b14d5"}
Apr 21 07:51:21.249767 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.249724 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-826zm\" (UniqueName: \"kubernetes.io/projected/292daedb-8f6d-4fbe-b50d-eff99dbdb227-kube-api-access-826zm\") pod \"network-metrics-daemon-mbkk9\" (UID: \"292daedb-8f6d-4fbe-b50d-eff99dbdb227\") " pod="openshift-multus/network-metrics-daemon-mbkk9"
Apr 21 07:51:21.335525 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.335494 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ps2z\" (UniqueName: \"kubernetes.io/projected/b183c600-7bbc-4275-b1b6-1a71e7b6cc15-kube-api-access-5ps2z\") pod \"network-check-target-9d664\" (UID: \"b183c600-7bbc-4275-b1b6-1a71e7b6cc15\") " pod="openshift-network-diagnostics/network-check-target-9d664"
Apr 21 07:51:21.335525 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.335531 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ee4e1b33-295c-4557-9f5f-2cc029155627-iptables-alerter-script\") pod \"iptables-alerter-vrkff\" (UID: \"ee4e1b33-295c-4557-9f5f-2cc029155627\") " pod="openshift-network-operator/iptables-alerter-vrkff"
Apr 21 07:51:21.335765 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.335553 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7j66m\" (UniqueName: \"kubernetes.io/projected/ee4e1b33-295c-4557-9f5f-2cc029155627-kube-api-access-7j66m\") pod \"iptables-alerter-vrkff\" (UID: \"ee4e1b33-295c-4557-9f5f-2cc029155627\") " pod="openshift-network-operator/iptables-alerter-vrkff"
Apr 21 07:51:21.335765 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.335584 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ee4e1b33-295c-4557-9f5f-2cc029155627-host-slash\") pod \"iptables-alerter-vrkff\" (UID: \"ee4e1b33-295c-4557-9f5f-2cc029155627\") " pod="openshift-network-operator/iptables-alerter-vrkff"
Apr 21 07:51:21.335765 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.335623 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4d8fb59f-57b4-44fc-bfd7-f9714c35d8be-tmp-dir\") pod \"node-resolver-f8v2g\" (UID: \"4d8fb59f-57b4-44fc-bfd7-f9714c35d8be\") " pod="openshift-dns/node-resolver-f8v2g"
Apr 21 07:51:21.335765 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.335710 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ee4e1b33-295c-4557-9f5f-2cc029155627-host-slash\") pod \"iptables-alerter-vrkff\" (UID: \"ee4e1b33-295c-4557-9f5f-2cc029155627\") " pod="openshift-network-operator/iptables-alerter-vrkff"
Apr 21 07:51:21.335991 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.335797 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nzwrs\" (UniqueName: \"kubernetes.io/projected/4d8fb59f-57b4-44fc-bfd7-f9714c35d8be-kube-api-access-nzwrs\") pod \"node-resolver-f8v2g\" (UID: \"4d8fb59f-57b4-44fc-bfd7-f9714c35d8be\") " pod="openshift-dns/node-resolver-f8v2g"
Apr 21 07:51:21.335991 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.335840 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4d8fb59f-57b4-44fc-bfd7-f9714c35d8be-hosts-file\") pod \"node-resolver-f8v2g\" (UID: \"4d8fb59f-57b4-44fc-bfd7-f9714c35d8be\") " pod="openshift-dns/node-resolver-f8v2g"
Apr 21 07:51:21.335991 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.335939 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4d8fb59f-57b4-44fc-bfd7-f9714c35d8be-tmp-dir\") pod \"node-resolver-f8v2g\" (UID: \"4d8fb59f-57b4-44fc-bfd7-f9714c35d8be\") " pod="openshift-dns/node-resolver-f8v2g"
Apr 21 07:51:21.336143 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.336023 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4d8fb59f-57b4-44fc-bfd7-f9714c35d8be-hosts-file\") pod \"node-resolver-f8v2g\" (UID: \"4d8fb59f-57b4-44fc-bfd7-f9714c35d8be\") " pod="openshift-dns/node-resolver-f8v2g"
Apr 21 07:51:21.336143 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.336132 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ee4e1b33-295c-4557-9f5f-2cc029155627-iptables-alerter-script\") pod \"iptables-alerter-vrkff\" (UID: \"ee4e1b33-295c-4557-9f5f-2cc029155627\") " pod="openshift-network-operator/iptables-alerter-vrkff"
Apr 21 07:51:21.341205 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:21.341185 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 07:51:21.341205 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:21.341208 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 07:51:21.341383 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:21.341221 2567 projected.go:194] Error preparing data for projected volume kube-api-access-5ps2z for pod openshift-network-diagnostics/network-check-target-9d664: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 07:51:21.341383 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:21.341292 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b183c600-7bbc-4275-b1b6-1a71e7b6cc15-kube-api-access-5ps2z podName:b183c600-7bbc-4275-b1b6-1a71e7b6cc15 nodeName:}" failed. No retries permitted until 2026-04-21 07:51:21.841270754 +0000 UTC m=+3.239517871 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-5ps2z" (UniqueName: "kubernetes.io/projected/b183c600-7bbc-4275-b1b6-1a71e7b6cc15-kube-api-access-5ps2z") pod "network-check-target-9d664" (UID: "b183c600-7bbc-4275-b1b6-1a71e7b6cc15") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 07:51:21.343804 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.343782 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j66m\" (UniqueName: \"kubernetes.io/projected/ee4e1b33-295c-4557-9f5f-2cc029155627-kube-api-access-7j66m\") pod \"iptables-alerter-vrkff\" (UID: \"ee4e1b33-295c-4557-9f5f-2cc029155627\") " pod="openshift-network-operator/iptables-alerter-vrkff"
Apr 21 07:51:21.344187 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.344169 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzwrs\" (UniqueName: \"kubernetes.io/projected/4d8fb59f-57b4-44fc-bfd7-f9714c35d8be-kube-api-access-nzwrs\") pod \"node-resolver-f8v2g\" (UID: \"4d8fb59f-57b4-44fc-bfd7-f9714c35d8be\") " pod="openshift-dns/node-resolver-f8v2g"
Apr 21 07:51:21.409357 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.409321 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-d2hxj"
Apr 21 07:51:21.429460 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.429422 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2phll"
Apr 21 07:51:21.439374 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.439349 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-8fs97"
Apr 21 07:51:21.445972 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.445947 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jjpf8"
Apr 21 07:51:21.453529 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.453509 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-pn27z"
Apr 21 07:51:21.462187 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.462165 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-z2d8g"
Apr 21 07:51:21.469820 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.469761 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z86cz"
Apr 21 07:51:21.478374 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.478347 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-f8v2g"
Apr 21 07:51:21.484947 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.484927 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-vrkff"
Apr 21 07:51:21.575845 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.575810 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 07:51:21.739612 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.739528 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/292daedb-8f6d-4fbe-b50d-eff99dbdb227-metrics-certs\") pod \"network-metrics-daemon-mbkk9\" (UID: \"292daedb-8f6d-4fbe-b50d-eff99dbdb227\") " pod="openshift-multus/network-metrics-daemon-mbkk9"
Apr 21 07:51:21.739788 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:21.739723 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:51:21.739846 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:21.739801 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/292daedb-8f6d-4fbe-b50d-eff99dbdb227-metrics-certs podName:292daedb-8f6d-4fbe-b50d-eff99dbdb227 nodeName:}" failed. No retries permitted until 2026-04-21 07:51:22.739782825 +0000 UTC m=+4.138029926 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/292daedb-8f6d-4fbe-b50d-eff99dbdb227-metrics-certs") pod "network-metrics-daemon-mbkk9" (UID: "292daedb-8f6d-4fbe-b50d-eff99dbdb227") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:51:21.941216 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:21.941178 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ps2z\" (UniqueName: \"kubernetes.io/projected/b183c600-7bbc-4275-b1b6-1a71e7b6cc15-kube-api-access-5ps2z\") pod \"network-check-target-9d664\" (UID: \"b183c600-7bbc-4275-b1b6-1a71e7b6cc15\") " pod="openshift-network-diagnostics/network-check-target-9d664"
Apr 21 07:51:21.941419 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:21.941389 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 07:51:21.941419 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:21.941422 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 07:51:21.941592 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:21.941433 2567 projected.go:194] Error preparing data for projected volume kube-api-access-5ps2z for pod openshift-network-diagnostics/network-check-target-9d664: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 07:51:21.941592 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:21.941487 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b183c600-7bbc-4275-b1b6-1a71e7b6cc15-kube-api-access-5ps2z podName:b183c600-7bbc-4275-b1b6-1a71e7b6cc15 nodeName:}" failed. No retries permitted until 2026-04-21 07:51:22.941471189 +0000 UTC m=+4.339718310 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-5ps2z" (UniqueName: "kubernetes.io/projected/b183c600-7bbc-4275-b1b6-1a71e7b6cc15-kube-api-access-5ps2z") pod "network-check-target-9d664" (UID: "b183c600-7bbc-4275-b1b6-1a71e7b6cc15") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 07:51:22.157975 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:22.157942 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 07:46:20 +0000 UTC" deadline="2027-12-18 13:48:52.453561905 +0000 UTC"
Apr 21 07:51:22.157975 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:22.157973 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14549h57m30.295592512s"
Apr 21 07:51:22.176555 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:22.176501 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14fce2a3_229e_4214_926c_0d2eb411facc.slice/crio-8e0b0f2eb9087c9e311dfbd5798ff4e26beb1346e5ba3a12f0799da12de585ab WatchSource:0}: Error finding container 8e0b0f2eb9087c9e311dfbd5798ff4e26beb1346e5ba3a12f0799da12de585ab: Status 404 returned error can't find the container with id 8e0b0f2eb9087c9e311dfbd5798ff4e26beb1346e5ba3a12f0799da12de585ab
Apr 21 07:51:22.178691 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:22.178628 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbed9d63_ec12_483e_ba8d_a4082bbfd141.slice/crio-acfc90b033fb2642d3989360cb94f93597613b74a3c1f2c0e0714c96b3b68904 WatchSource:0}: Error finding container acfc90b033fb2642d3989360cb94f93597613b74a3c1f2c0e0714c96b3b68904: Status 404 returned error can't find the container with id acfc90b033fb2642d3989360cb94f93597613b74a3c1f2c0e0714c96b3b68904
Apr 21 07:51:22.181743 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:22.181710 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod910435a2_053a_4a3e_9020_156057e0c177.slice/crio-2e01a1cd777df395e6e23009f17ce750a2f40f9cca938ae0ea53ceb0db141970 WatchSource:0}: Error finding container 2e01a1cd777df395e6e23009f17ce750a2f40f9cca938ae0ea53ceb0db141970: Status 404 returned error can't find the container with id 2e01a1cd777df395e6e23009f17ce750a2f40f9cca938ae0ea53ceb0db141970
Apr 21 07:51:22.183790 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:22.183748 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee4e1b33_295c_4557_9f5f_2cc029155627.slice/crio-51d322461fd5c68d0fa3545c5a73296bdef099edcac0777aa72f3eab82817fdb WatchSource:0}: Error finding container 51d322461fd5c68d0fa3545c5a73296bdef099edcac0777aa72f3eab82817fdb: Status 404 returned error can't find the container with id 51d322461fd5c68d0fa3545c5a73296bdef099edcac0777aa72f3eab82817fdb
Apr 21 07:51:22.185789 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:22.185130 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8a61d55_c981_4fda_bb59_0fc4d138d739.slice/crio-aa4b26f1e3a7d2515d326352740ae12e78ea2878ba759402361c5bad1cb5a7f4 WatchSource:0}: Error finding container aa4b26f1e3a7d2515d326352740ae12e78ea2878ba759402361c5bad1cb5a7f4: Status 404 returned error can't find the container with id aa4b26f1e3a7d2515d326352740ae12e78ea2878ba759402361c5bad1cb5a7f4
Apr 21 07:51:22.185789 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:22.185552 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d8fb59f_57b4_44fc_bfd7_f9714c35d8be.slice/crio-bcd6fabbbb134b0d289dd84a90cd502e47fd128d3809536904d15f18b47248b8 WatchSource:0}: Error finding container bcd6fabbbb134b0d289dd84a90cd502e47fd128d3809536904d15f18b47248b8: Status 404 returned error can't find the container with id bcd6fabbbb134b0d289dd84a90cd502e47fd128d3809536904d15f18b47248b8
Apr 21 07:51:22.188340 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:22.188308 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77cd248d_7f69_4be8_a1e1_3df94ad81274.slice/crio-9cb5bc22d2389dcf64917f9fd96de213a5bdd3d165afd63a111627a5919a958f WatchSource:0}: Error finding container 9cb5bc22d2389dcf64917f9fd96de213a5bdd3d165afd63a111627a5919a958f: Status 404 returned error can't find the container with id 9cb5bc22d2389dcf64917f9fd96de213a5bdd3d165afd63a111627a5919a958f
Apr 21 07:51:22.188818 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:22.188714 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79b97bca_1c70_43d9_b07b_3b0ac8671a20.slice/crio-4dcb5dc297f5f394f20b1adb06643529c0da997d3cc15926d2e71004054feff5 WatchSource:0}: Error finding container 4dcb5dc297f5f394f20b1adb06643529c0da997d3cc15926d2e71004054feff5: Status 404 returned error can't find the container with id 4dcb5dc297f5f394f20b1adb06643529c0da997d3cc15926d2e71004054feff5
Apr 21 07:51:22.235981 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:22.235752 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mbkk9"
Apr 21 07:51:22.236158 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:22.236089 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mbkk9" podUID="292daedb-8f6d-4fbe-b50d-eff99dbdb227"
Apr 21 07:51:22.250714 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:22.250683 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2phll" event={"ID":"a8a61d55-c981-4fda-bb59-0fc4d138d739","Type":"ContainerStarted","Data":"aa4b26f1e3a7d2515d326352740ae12e78ea2878ba759402361c5bad1cb5a7f4"}
Apr 21 07:51:22.251564 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:22.251535 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pn27z" event={"ID":"fbed9d63-ec12-483e-ba8d-a4082bbfd141","Type":"ContainerStarted","Data":"acfc90b033fb2642d3989360cb94f93597613b74a3c1f2c0e0714c96b3b68904"}
Apr 21 07:51:22.252448 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:22.252431 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-8fs97" event={"ID":"14fce2a3-229e-4214-926c-0d2eb411facc","Type":"ContainerStarted","Data":"8e0b0f2eb9087c9e311dfbd5798ff4e26beb1346e5ba3a12f0799da12de585ab"}
Apr 21 07:51:22.253368 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:22.253350 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z86cz" event={"ID":"79b97bca-1c70-43d9-b07b-3b0ac8671a20","Type":"ContainerStarted","Data":"4dcb5dc297f5f394f20b1adb06643529c0da997d3cc15926d2e71004054feff5"}
Apr 21 07:51:22.254149 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:22.254130 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z2d8g" event={"ID":"7eac19c8-be6a-49df-a01f-690587797f2d","Type":"ContainerStarted","Data":"eb77972b660492c0382edd58026ec0bb96156c933a28717e53b30712928ce19c"}
Apr 21 07:51:22.254936 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:22.254904 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-f8v2g" event={"ID":"4d8fb59f-57b4-44fc-bfd7-f9714c35d8be","Type":"ContainerStarted","Data":"bcd6fabbbb134b0d289dd84a90cd502e47fd128d3809536904d15f18b47248b8"}
Apr 21 07:51:22.255751 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:22.255735 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d2hxj" event={"ID":"910435a2-053a-4a3e-9020-156057e0c177","Type":"ContainerStarted","Data":"2e01a1cd777df395e6e23009f17ce750a2f40f9cca938ae0ea53ceb0db141970"}
Apr 21 07:51:22.256550 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:22.256533 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jjpf8" event={"ID":"77cd248d-7f69-4be8-a1e1-3df94ad81274","Type":"ContainerStarted","Data":"9cb5bc22d2389dcf64917f9fd96de213a5bdd3d165afd63a111627a5919a958f"}
Apr 21 07:51:22.257424 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:22.257402 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-vrkff" event={"ID":"ee4e1b33-295c-4557-9f5f-2cc029155627","Type":"ContainerStarted","Data":"51d322461fd5c68d0fa3545c5a73296bdef099edcac0777aa72f3eab82817fdb"}
Apr 21 07:51:22.747444 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:22.747379 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/292daedb-8f6d-4fbe-b50d-eff99dbdb227-metrics-certs\") pod \"network-metrics-daemon-mbkk9\" (UID: \"292daedb-8f6d-4fbe-b50d-eff99dbdb227\") " pod="openshift-multus/network-metrics-daemon-mbkk9"
Apr 21 07:51:22.747641 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:22.747556 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:51:22.747641 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:22.747623 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/292daedb-8f6d-4fbe-b50d-eff99dbdb227-metrics-certs podName:292daedb-8f6d-4fbe-b50d-eff99dbdb227 nodeName:}" failed. No retries permitted until 2026-04-21 07:51:24.747602273 +0000 UTC m=+6.145849377 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/292daedb-8f6d-4fbe-b50d-eff99dbdb227-metrics-certs") pod "network-metrics-daemon-mbkk9" (UID: "292daedb-8f6d-4fbe-b50d-eff99dbdb227") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:51:22.949729 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:22.949692 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ps2z\" (UniqueName: \"kubernetes.io/projected/b183c600-7bbc-4275-b1b6-1a71e7b6cc15-kube-api-access-5ps2z\") pod \"network-check-target-9d664\" (UID: \"b183c600-7bbc-4275-b1b6-1a71e7b6cc15\") " pod="openshift-network-diagnostics/network-check-target-9d664"
Apr 21 07:51:22.949912 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:22.949867 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 07:51:22.949912 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:22.949887 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 07:51:22.949912 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:22.949900 2567 projected.go:194] Error preparing data for projected volume kube-api-access-5ps2z for pod openshift-network-diagnostics/network-check-target-9d664: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 07:51:22.950061 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:22.949959 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b183c600-7bbc-4275-b1b6-1a71e7b6cc15-kube-api-access-5ps2z podName:b183c600-7bbc-4275-b1b6-1a71e7b6cc15 nodeName:}" failed. No retries permitted until 2026-04-21 07:51:24.94994052 +0000 UTC m=+6.348187638 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-5ps2z" (UniqueName: "kubernetes.io/projected/b183c600-7bbc-4275-b1b6-1a71e7b6cc15-kube-api-access-5ps2z") pod "network-check-target-9d664" (UID: "b183c600-7bbc-4275-b1b6-1a71e7b6cc15") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 07:51:23.242746 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:23.242718 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9d664"
Apr 21 07:51:23.243150 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:23.242842 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9d664" podUID="b183c600-7bbc-4275-b1b6-1a71e7b6cc15"
Apr 21 07:51:23.272548 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:23.272514 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-176.ec2.internal" event={"ID":"ff2da2957d53f026a481f44e2475b521","Type":"ContainerStarted","Data":"540c831db4d81142cb3e80a047a1800543abbd94e2f36832ae470839983dfbe6"}
Apr 21 07:51:24.236333 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:24.236296 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mbkk9"
Apr 21 07:51:24.236527 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:24.236440 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-mbkk9" podUID="292daedb-8f6d-4fbe-b50d-eff99dbdb227" Apr 21 07:51:24.282135 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:24.280912 2567 generic.go:358] "Generic (PLEG): container finished" podID="2f02deaf3d3626f43477aefec04d329f" containerID="85fb6f69090548c7f1c696c85d924ed4c78ee7a117dc9756f00d4178edf7f302" exitCode=0 Apr 21 07:51:24.282135 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:24.281936 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-176.ec2.internal" event={"ID":"2f02deaf3d3626f43477aefec04d329f","Type":"ContainerDied","Data":"85fb6f69090548c7f1c696c85d924ed4c78ee7a117dc9756f00d4178edf7f302"} Apr 21 07:51:24.294775 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:24.294723 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-176.ec2.internal" podStartSLOduration=4.29470594 podStartE2EDuration="4.29470594s" podCreationTimestamp="2026-04-21 07:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:51:23.288885905 +0000 UTC m=+4.687133028" watchObservedRunningTime="2026-04-21 07:51:24.29470594 +0000 UTC m=+5.692953059" Apr 21 07:51:24.765145 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:24.765105 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/292daedb-8f6d-4fbe-b50d-eff99dbdb227-metrics-certs\") pod \"network-metrics-daemon-mbkk9\" (UID: \"292daedb-8f6d-4fbe-b50d-eff99dbdb227\") " pod="openshift-multus/network-metrics-daemon-mbkk9" Apr 21 07:51:24.765322 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:24.765284 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 
07:51:24.765379 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:24.765351 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/292daedb-8f6d-4fbe-b50d-eff99dbdb227-metrics-certs podName:292daedb-8f6d-4fbe-b50d-eff99dbdb227 nodeName:}" failed. No retries permitted until 2026-04-21 07:51:28.765331525 +0000 UTC m=+10.163578628 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/292daedb-8f6d-4fbe-b50d-eff99dbdb227-metrics-certs") pod "network-metrics-daemon-mbkk9" (UID: "292daedb-8f6d-4fbe-b50d-eff99dbdb227") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:51:24.967264 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:24.967164 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ps2z\" (UniqueName: \"kubernetes.io/projected/b183c600-7bbc-4275-b1b6-1a71e7b6cc15-kube-api-access-5ps2z\") pod \"network-check-target-9d664\" (UID: \"b183c600-7bbc-4275-b1b6-1a71e7b6cc15\") " pod="openshift-network-diagnostics/network-check-target-9d664" Apr 21 07:51:24.967454 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:24.967434 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 07:51:24.967522 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:24.967462 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 07:51:24.967522 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:24.967476 2567 projected.go:194] Error preparing data for projected volume kube-api-access-5ps2z for pod openshift-network-diagnostics/network-check-target-9d664: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:51:24.967624 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:24.967541 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b183c600-7bbc-4275-b1b6-1a71e7b6cc15-kube-api-access-5ps2z podName:b183c600-7bbc-4275-b1b6-1a71e7b6cc15 nodeName:}" failed. No retries permitted until 2026-04-21 07:51:28.967520553 +0000 UTC m=+10.365767671 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-5ps2z" (UniqueName: "kubernetes.io/projected/b183c600-7bbc-4275-b1b6-1a71e7b6cc15-kube-api-access-5ps2z") pod "network-check-target-9d664" (UID: "b183c600-7bbc-4275-b1b6-1a71e7b6cc15") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:51:25.239811 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:25.239735 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9d664" Apr 21 07:51:25.239969 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:25.239850 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9d664" podUID="b183c600-7bbc-4275-b1b6-1a71e7b6cc15" Apr 21 07:51:25.285273 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:25.285233 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-176.ec2.internal" event={"ID":"2f02deaf3d3626f43477aefec04d329f","Type":"ContainerStarted","Data":"b1259bdfc59904573619f54b63bbe45260e19b50451c6aeeca8b856a7e3ea4e6"} Apr 21 07:51:26.236110 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:26.235620 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mbkk9" Apr 21 07:51:26.236110 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:26.235782 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mbkk9" podUID="292daedb-8f6d-4fbe-b50d-eff99dbdb227" Apr 21 07:51:27.241415 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:27.241383 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9d664" Apr 21 07:51:27.241904 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:27.241826 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9d664" podUID="b183c600-7bbc-4275-b1b6-1a71e7b6cc15" Apr 21 07:51:28.236169 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:28.236098 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mbkk9" Apr 21 07:51:28.236343 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:28.236243 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mbkk9" podUID="292daedb-8f6d-4fbe-b50d-eff99dbdb227" Apr 21 07:51:28.800619 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:28.800584 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/292daedb-8f6d-4fbe-b50d-eff99dbdb227-metrics-certs\") pod \"network-metrics-daemon-mbkk9\" (UID: \"292daedb-8f6d-4fbe-b50d-eff99dbdb227\") " pod="openshift-multus/network-metrics-daemon-mbkk9" Apr 21 07:51:28.801142 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:28.800758 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:51:28.801142 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:28.800826 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/292daedb-8f6d-4fbe-b50d-eff99dbdb227-metrics-certs podName:292daedb-8f6d-4fbe-b50d-eff99dbdb227 nodeName:}" failed. No retries permitted until 2026-04-21 07:51:36.800805003 +0000 UTC m=+18.199052102 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/292daedb-8f6d-4fbe-b50d-eff99dbdb227-metrics-certs") pod "network-metrics-daemon-mbkk9" (UID: "292daedb-8f6d-4fbe-b50d-eff99dbdb227") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:51:29.002755 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:29.002693 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ps2z\" (UniqueName: \"kubernetes.io/projected/b183c600-7bbc-4275-b1b6-1a71e7b6cc15-kube-api-access-5ps2z\") pod \"network-check-target-9d664\" (UID: \"b183c600-7bbc-4275-b1b6-1a71e7b6cc15\") " pod="openshift-network-diagnostics/network-check-target-9d664" Apr 21 07:51:29.002942 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:29.002886 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 07:51:29.002942 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:29.002905 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 07:51:29.002942 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:29.002918 2567 projected.go:194] Error preparing data for projected volume kube-api-access-5ps2z for pod openshift-network-diagnostics/network-check-target-9d664: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:51:29.003113 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:29.002981 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b183c600-7bbc-4275-b1b6-1a71e7b6cc15-kube-api-access-5ps2z podName:b183c600-7bbc-4275-b1b6-1a71e7b6cc15 nodeName:}" failed. 
No retries permitted until 2026-04-21 07:51:37.002962884 +0000 UTC m=+18.401210003 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-5ps2z" (UniqueName: "kubernetes.io/projected/b183c600-7bbc-4275-b1b6-1a71e7b6cc15-kube-api-access-5ps2z") pod "network-check-target-9d664" (UID: "b183c600-7bbc-4275-b1b6-1a71e7b6cc15") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:51:29.239713 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:29.239225 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9d664" Apr 21 07:51:29.239713 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:29.239336 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9d664" podUID="b183c600-7bbc-4275-b1b6-1a71e7b6cc15" Apr 21 07:51:30.236352 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:30.236320 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mbkk9" Apr 21 07:51:30.236876 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:30.236439 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mbkk9" podUID="292daedb-8f6d-4fbe-b50d-eff99dbdb227" Apr 21 07:51:31.236091 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:31.236036 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9d664" Apr 21 07:51:31.236254 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:31.236164 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9d664" podUID="b183c600-7bbc-4275-b1b6-1a71e7b6cc15" Apr 21 07:51:32.235552 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:32.235514 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mbkk9" Apr 21 07:51:32.236068 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:32.235643 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mbkk9" podUID="292daedb-8f6d-4fbe-b50d-eff99dbdb227" Apr 21 07:51:33.235371 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:33.235329 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9d664" Apr 21 07:51:33.235554 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:33.235450 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9d664" podUID="b183c600-7bbc-4275-b1b6-1a71e7b6cc15" Apr 21 07:51:34.235572 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:34.235534 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mbkk9" Apr 21 07:51:34.236097 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:34.235687 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mbkk9" podUID="292daedb-8f6d-4fbe-b50d-eff99dbdb227" Apr 21 07:51:35.235961 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:35.235875 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9d664" Apr 21 07:51:35.236428 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:35.236002 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9d664" podUID="b183c600-7bbc-4275-b1b6-1a71e7b6cc15" Apr 21 07:51:36.236301 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:36.236262 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mbkk9" Apr 21 07:51:36.236830 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:36.236394 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mbkk9" podUID="292daedb-8f6d-4fbe-b50d-eff99dbdb227" Apr 21 07:51:36.859511 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:36.859468 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/292daedb-8f6d-4fbe-b50d-eff99dbdb227-metrics-certs\") pod \"network-metrics-daemon-mbkk9\" (UID: \"292daedb-8f6d-4fbe-b50d-eff99dbdb227\") " pod="openshift-multus/network-metrics-daemon-mbkk9" Apr 21 07:51:36.859725 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:36.859598 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:51:36.859725 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:36.859684 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/292daedb-8f6d-4fbe-b50d-eff99dbdb227-metrics-certs podName:292daedb-8f6d-4fbe-b50d-eff99dbdb227 nodeName:}" failed. No retries permitted until 2026-04-21 07:51:52.859663346 +0000 UTC m=+34.257910464 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/292daedb-8f6d-4fbe-b50d-eff99dbdb227-metrics-certs") pod "network-metrics-daemon-mbkk9" (UID: "292daedb-8f6d-4fbe-b50d-eff99dbdb227") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:51:37.060846 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:37.060806 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ps2z\" (UniqueName: \"kubernetes.io/projected/b183c600-7bbc-4275-b1b6-1a71e7b6cc15-kube-api-access-5ps2z\") pod \"network-check-target-9d664\" (UID: \"b183c600-7bbc-4275-b1b6-1a71e7b6cc15\") " pod="openshift-network-diagnostics/network-check-target-9d664" Apr 21 07:51:37.061024 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:37.060960 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 07:51:37.061024 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:37.060978 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 07:51:37.061024 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:37.060990 2567 projected.go:194] Error preparing data for projected volume kube-api-access-5ps2z for pod openshift-network-diagnostics/network-check-target-9d664: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:51:37.061176 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:37.061069 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b183c600-7bbc-4275-b1b6-1a71e7b6cc15-kube-api-access-5ps2z podName:b183c600-7bbc-4275-b1b6-1a71e7b6cc15 nodeName:}" failed. 
No retries permitted until 2026-04-21 07:51:53.061051962 +0000 UTC m=+34.459299065 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-5ps2z" (UniqueName: "kubernetes.io/projected/b183c600-7bbc-4275-b1b6-1a71e7b6cc15-kube-api-access-5ps2z") pod "network-check-target-9d664" (UID: "b183c600-7bbc-4275-b1b6-1a71e7b6cc15") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:51:37.236459 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:37.236368 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9d664" Apr 21 07:51:37.236877 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:37.236492 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9d664" podUID="b183c600-7bbc-4275-b1b6-1a71e7b6cc15" Apr 21 07:51:38.235835 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:38.235799 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mbkk9" Apr 21 07:51:38.235988 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:38.235933 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mbkk9" podUID="292daedb-8f6d-4fbe-b50d-eff99dbdb227" Apr 21 07:51:39.237030 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:39.236987 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9d664" Apr 21 07:51:39.237496 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:39.237108 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9d664" podUID="b183c600-7bbc-4275-b1b6-1a71e7b6cc15" Apr 21 07:51:40.236103 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:40.235695 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mbkk9" Apr 21 07:51:40.236265 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:40.236240 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mbkk9" podUID="292daedb-8f6d-4fbe-b50d-eff99dbdb227" Apr 21 07:51:40.313839 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:40.313788 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2phll" event={"ID":"a8a61d55-c981-4fda-bb59-0fc4d138d739","Type":"ContainerStarted","Data":"2d566ec8673810844ec8233a53384285a2a9149ebd0ed80b7c4733c4cde14d5f"} Apr 21 07:51:40.313839 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:40.313833 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2phll" event={"ID":"a8a61d55-c981-4fda-bb59-0fc4d138d739","Type":"ContainerStarted","Data":"19e91cfcf459ecfba589ac456f1d8446c6ea355a40de11cd827a579ee70863de"} Apr 21 07:51:40.313839 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:40.313848 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2phll" event={"ID":"a8a61d55-c981-4fda-bb59-0fc4d138d739","Type":"ContainerStarted","Data":"ae766935d57fa70002bedfbcd35422bdeb7ec49b85ee545c541313ec727ac752"} Apr 21 07:51:40.314809 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:40.313861 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2phll" event={"ID":"a8a61d55-c981-4fda-bb59-0fc4d138d739","Type":"ContainerStarted","Data":"e6c1ea713468fcc46aba29dd64b19e3a6d0f3a857d440d2ff0e933e0b0f7ebb4"} Apr 21 07:51:40.314809 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:40.313873 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2phll" event={"ID":"a8a61d55-c981-4fda-bb59-0fc4d138d739","Type":"ContainerStarted","Data":"60e146b1feb77cbaafbc9bf4719dc31ef4c2492c30897143cee39d2b54c7b3db"} Apr 21 07:51:40.314809 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:40.313883 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2phll" 
event={"ID":"a8a61d55-c981-4fda-bb59-0fc4d138d739","Type":"ContainerStarted","Data":"33f619e3b160bbb968c5a759abf90702ebf80b9aebc1ae07d6831919dc8b5a11"} Apr 21 07:51:40.315343 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:40.315282 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pn27z" event={"ID":"fbed9d63-ec12-483e-ba8d-a4082bbfd141","Type":"ContainerStarted","Data":"30ef3545c96af9b8b2baec5eb80da2961c24132f96f6049e783f5d7ca7eeaed9"} Apr 21 07:51:40.316722 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:40.316693 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-8fs97" event={"ID":"14fce2a3-229e-4214-926c-0d2eb411facc","Type":"ContainerStarted","Data":"18462b4e2fa857f5ddb602480ca980f8441e10fcf82f4b95beda5597eb6050ce"} Apr 21 07:51:40.318347 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:40.318323 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z86cz" event={"ID":"79b97bca-1c70-43d9-b07b-3b0ac8671a20","Type":"ContainerStarted","Data":"feca08f27570f6322c8b4936951abc08edf473e238c1bded3282958088b9eca9"} Apr 21 07:51:40.319780 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:40.319755 2567 generic.go:358] "Generic (PLEG): container finished" podID="7eac19c8-be6a-49df-a01f-690587797f2d" containerID="70940b3999958f43be049e3e992ad1c8f61cd5852830a2263bccd16f232019c5" exitCode=0 Apr 21 07:51:40.319878 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:40.319826 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z2d8g" event={"ID":"7eac19c8-be6a-49df-a01f-690587797f2d","Type":"ContainerDied","Data":"70940b3999958f43be049e3e992ad1c8f61cd5852830a2263bccd16f232019c5"} Apr 21 07:51:40.321127 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:40.321088 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-f8v2g" 
event={"ID":"4d8fb59f-57b4-44fc-bfd7-f9714c35d8be","Type":"ContainerStarted","Data":"0222f4a698c9c496adc1dcdaff6130cbd667b3d801cc335ee619c9322d75c694"}
Apr 21 07:51:40.322609 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:40.322579 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d2hxj" event={"ID":"910435a2-053a-4a3e-9020-156057e0c177","Type":"ContainerStarted","Data":"249173a25452c13ed1ceaeb4ab9c75037a7e503e37d23c1446d0e1864f8536eb"}
Apr 21 07:51:40.324046 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:40.324025 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jjpf8" event={"ID":"77cd248d-7f69-4be8-a1e1-3df94ad81274","Type":"ContainerStarted","Data":"0174b984c18714a46c88f8edc3866852deed375de75a2c289e0b54ff9f92f708"}
Apr 21 07:51:40.326911 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:40.326878 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-pn27z" podStartSLOduration=4.215107715 podStartE2EDuration="21.326868522s" podCreationTimestamp="2026-04-21 07:51:19 +0000 UTC" firstStartedPulling="2026-04-21 07:51:22.182904464 +0000 UTC m=+3.581151575" lastFinishedPulling="2026-04-21 07:51:39.294665267 +0000 UTC m=+20.692912382" observedRunningTime="2026-04-21 07:51:40.326696748 +0000 UTC m=+21.724943869" watchObservedRunningTime="2026-04-21 07:51:40.326868522 +0000 UTC m=+21.725115641"
Apr 21 07:51:40.327176 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:40.327147 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-176.ec2.internal" podStartSLOduration=20.327140393 podStartE2EDuration="20.327140393s" podCreationTimestamp="2026-04-21 07:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:51:25.297044574 +0000 UTC m=+6.695291685" watchObservedRunningTime="2026-04-21 07:51:40.327140393 +0000 UTC m=+21.725387514"
Apr 21 07:51:40.349534 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:40.349490 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-f8v2g" podStartSLOduration=4.242829716 podStartE2EDuration="21.349476209s" podCreationTimestamp="2026-04-21 07:51:19 +0000 UTC" firstStartedPulling="2026-04-21 07:51:22.187999529 +0000 UTC m=+3.586246632" lastFinishedPulling="2026-04-21 07:51:39.294646025 +0000 UTC m=+20.692893125" observedRunningTime="2026-04-21 07:51:40.336904544 +0000 UTC m=+21.735151665" watchObservedRunningTime="2026-04-21 07:51:40.349476209 +0000 UTC m=+21.747723334"
Apr 21 07:51:40.349850 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:40.349818 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-8fs97" podStartSLOduration=4.233784021 podStartE2EDuration="21.349810565s" podCreationTimestamp="2026-04-21 07:51:19 +0000 UTC" firstStartedPulling="2026-04-21 07:51:22.178887004 +0000 UTC m=+3.577134103" lastFinishedPulling="2026-04-21 07:51:39.29491353 +0000 UTC m=+20.693160647" observedRunningTime="2026-04-21 07:51:40.349388457 +0000 UTC m=+21.747635579" watchObservedRunningTime="2026-04-21 07:51:40.349810565 +0000 UTC m=+21.748057685"
Apr 21 07:51:40.365366 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:40.365321 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-jjpf8" podStartSLOduration=4.262412224 podStartE2EDuration="21.365308692s" podCreationTimestamp="2026-04-21 07:51:19 +0000 UTC" firstStartedPulling="2026-04-21 07:51:22.191873344 +0000 UTC m=+3.590120457" lastFinishedPulling="2026-04-21 07:51:39.294769818 +0000 UTC m=+20.693016925" observedRunningTime="2026-04-21 07:51:40.364871646 +0000 UTC m=+21.763118767" watchObservedRunningTime="2026-04-21 07:51:40.365308692 +0000 UTC m=+21.763555816"
Apr 21 07:51:40.401988 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:40.401925 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-d2hxj" podStartSLOduration=4.267228161 podStartE2EDuration="21.4019095s" podCreationTimestamp="2026-04-21 07:51:19 +0000 UTC" firstStartedPulling="2026-04-21 07:51:22.188059707 +0000 UTC m=+3.586306809" lastFinishedPulling="2026-04-21 07:51:39.322741038 +0000 UTC m=+20.720988148" observedRunningTime="2026-04-21 07:51:40.400863177 +0000 UTC m=+21.799110296" watchObservedRunningTime="2026-04-21 07:51:40.4019095 +0000 UTC m=+21.800156621"
Apr 21 07:51:40.795000 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:40.794974 2567 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 21 07:51:40.980590 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:40.980563 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-9cpch"]
Apr 21 07:51:41.000905 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:41.000881 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9cpch"
Apr 21 07:51:41.001000 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:41.000948 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9cpch" podUID="a349eff5-16ab-4567-b234-b08f49e937a1"
Apr 21 07:51:41.089314 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:41.089278 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a349eff5-16ab-4567-b234-b08f49e937a1-dbus\") pod \"global-pull-secret-syncer-9cpch\" (UID: \"a349eff5-16ab-4567-b234-b08f49e937a1\") " pod="kube-system/global-pull-secret-syncer-9cpch"
Apr 21 07:51:41.089485 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:41.089321 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a349eff5-16ab-4567-b234-b08f49e937a1-original-pull-secret\") pod \"global-pull-secret-syncer-9cpch\" (UID: \"a349eff5-16ab-4567-b234-b08f49e937a1\") " pod="kube-system/global-pull-secret-syncer-9cpch"
Apr 21 07:51:41.089485 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:41.089352 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a349eff5-16ab-4567-b234-b08f49e937a1-kubelet-config\") pod \"global-pull-secret-syncer-9cpch\" (UID: \"a349eff5-16ab-4567-b234-b08f49e937a1\") " pod="kube-system/global-pull-secret-syncer-9cpch"
Apr 21 07:51:41.190734 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:41.190644 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a349eff5-16ab-4567-b234-b08f49e937a1-kubelet-config\") pod \"global-pull-secret-syncer-9cpch\" (UID: \"a349eff5-16ab-4567-b234-b08f49e937a1\") " pod="kube-system/global-pull-secret-syncer-9cpch"
Apr 21 07:51:41.190888 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:41.190753 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a349eff5-16ab-4567-b234-b08f49e937a1-dbus\") pod \"global-pull-secret-syncer-9cpch\" (UID: \"a349eff5-16ab-4567-b234-b08f49e937a1\") " pod="kube-system/global-pull-secret-syncer-9cpch"
Apr 21 07:51:41.190888 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:41.190774 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a349eff5-16ab-4567-b234-b08f49e937a1-kubelet-config\") pod \"global-pull-secret-syncer-9cpch\" (UID: \"a349eff5-16ab-4567-b234-b08f49e937a1\") " pod="kube-system/global-pull-secret-syncer-9cpch"
Apr 21 07:51:41.190888 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:41.190782 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a349eff5-16ab-4567-b234-b08f49e937a1-original-pull-secret\") pod \"global-pull-secret-syncer-9cpch\" (UID: \"a349eff5-16ab-4567-b234-b08f49e937a1\") " pod="kube-system/global-pull-secret-syncer-9cpch"
Apr 21 07:51:41.190888 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:41.190863 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 07:51:41.191098 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:41.190911 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a349eff5-16ab-4567-b234-b08f49e937a1-dbus\") pod \"global-pull-secret-syncer-9cpch\" (UID: \"a349eff5-16ab-4567-b234-b08f49e937a1\") " pod="kube-system/global-pull-secret-syncer-9cpch"
Apr 21 07:51:41.191098 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:41.190922 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a349eff5-16ab-4567-b234-b08f49e937a1-original-pull-secret podName:a349eff5-16ab-4567-b234-b08f49e937a1 nodeName:}" failed. No retries permitted until 2026-04-21 07:51:41.690904229 +0000 UTC m=+23.089151329 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a349eff5-16ab-4567-b234-b08f49e937a1-original-pull-secret") pod "global-pull-secret-syncer-9cpch" (UID: "a349eff5-16ab-4567-b234-b08f49e937a1") : object "kube-system"/"original-pull-secret" not registered
Apr 21 07:51:41.201760 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:41.201680 2567 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T07:51:40.794994559Z","UUID":"fa239041-806f-4d7f-ba8d-fd8f2cb1c17c","Handler":null,"Name":"","Endpoint":""}
Apr 21 07:51:41.203441 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:41.203410 2567 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 21 07:51:41.203441 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:41.203438 2567 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 21 07:51:41.235464 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:41.235434 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9d664"
Apr 21 07:51:41.235602 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:41.235538 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9d664" podUID="b183c600-7bbc-4275-b1b6-1a71e7b6cc15"
Apr 21 07:51:41.328701 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:41.328662 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z86cz" event={"ID":"79b97bca-1c70-43d9-b07b-3b0ac8671a20","Type":"ContainerStarted","Data":"601f2411f85b583407c8289b592e65ab92b688fc9b7498d3b1b7624081bef594"}
Apr 21 07:51:41.330429 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:41.330391 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-vrkff" event={"ID":"ee4e1b33-295c-4557-9f5f-2cc029155627","Type":"ContainerStarted","Data":"c6d94d59fa85e40d050d9a190744c4d7c56859cf01b1ecbd83887ac0edb39049"}
Apr 21 07:51:41.352349 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:41.352299 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-vrkff" podStartSLOduration=5.234157554 podStartE2EDuration="22.352285661s" podCreationTimestamp="2026-04-21 07:51:19 +0000 UTC" firstStartedPulling="2026-04-21 07:51:22.188042218 +0000 UTC m=+3.586289320" lastFinishedPulling="2026-04-21 07:51:39.306170326 +0000 UTC m=+20.704417427" observedRunningTime="2026-04-21 07:51:41.352051053 +0000 UTC m=+22.750298173" watchObservedRunningTime="2026-04-21 07:51:41.352285661 +0000 UTC m=+22.750532782"
Apr 21 07:51:41.695828 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:41.695799 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a349eff5-16ab-4567-b234-b08f49e937a1-original-pull-secret\") pod \"global-pull-secret-syncer-9cpch\" (UID: \"a349eff5-16ab-4567-b234-b08f49e937a1\") " pod="kube-system/global-pull-secret-syncer-9cpch"
Apr 21 07:51:41.696014 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:41.695951 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 07:51:41.696073 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:41.696024 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a349eff5-16ab-4567-b234-b08f49e937a1-original-pull-secret podName:a349eff5-16ab-4567-b234-b08f49e937a1 nodeName:}" failed. No retries permitted until 2026-04-21 07:51:42.696005045 +0000 UTC m=+24.094252147 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a349eff5-16ab-4567-b234-b08f49e937a1-original-pull-secret") pod "global-pull-secret-syncer-9cpch" (UID: "a349eff5-16ab-4567-b234-b08f49e937a1") : object "kube-system"/"original-pull-secret" not registered
Apr 21 07:51:42.130428 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:42.130394 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-8fs97"
Apr 21 07:51:42.131048 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:42.131024 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-8fs97"
Apr 21 07:51:42.235311 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:42.235274 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9cpch"
Apr 21 07:51:42.235311 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:42.235298 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mbkk9"
Apr 21 07:51:42.235543 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:42.235409 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9cpch" podUID="a349eff5-16ab-4567-b234-b08f49e937a1"
Apr 21 07:51:42.235543 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:42.235529 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mbkk9" podUID="292daedb-8f6d-4fbe-b50d-eff99dbdb227"
Apr 21 07:51:42.337017 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:42.336957 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z86cz" event={"ID":"79b97bca-1c70-43d9-b07b-3b0ac8671a20","Type":"ContainerStarted","Data":"a31513c44ccb437b5514c8f55885891156f3a0687005e0045830f582989a5d49"}
Apr 21 07:51:42.356459 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:42.356411 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z86cz" podStartSLOduration=3.63535678 podStartE2EDuration="23.356395934s" podCreationTimestamp="2026-04-21 07:51:19 +0000 UTC" firstStartedPulling="2026-04-21 07:51:22.191689226 +0000 UTC m=+3.589936336" lastFinishedPulling="2026-04-21 07:51:41.912728382 +0000 UTC m=+23.310975490" observedRunningTime="2026-04-21 07:51:42.356139647 +0000 UTC m=+23.754386771" watchObservedRunningTime="2026-04-21 07:51:42.356395934 +0000 UTC m=+23.754643098"
Apr 21 07:51:42.702217 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:42.701971 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a349eff5-16ab-4567-b234-b08f49e937a1-original-pull-secret\") pod \"global-pull-secret-syncer-9cpch\" (UID: \"a349eff5-16ab-4567-b234-b08f49e937a1\") " pod="kube-system/global-pull-secret-syncer-9cpch"
Apr 21 07:51:42.702217 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:42.702162 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 07:51:42.702407 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:42.702262 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a349eff5-16ab-4567-b234-b08f49e937a1-original-pull-secret podName:a349eff5-16ab-4567-b234-b08f49e937a1 nodeName:}" failed. No retries permitted until 2026-04-21 07:51:44.702240968 +0000 UTC m=+26.100488067 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a349eff5-16ab-4567-b234-b08f49e937a1-original-pull-secret") pod "global-pull-secret-syncer-9cpch" (UID: "a349eff5-16ab-4567-b234-b08f49e937a1") : object "kube-system"/"original-pull-secret" not registered
Apr 21 07:51:43.236141 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:43.236104 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9d664"
Apr 21 07:51:43.236303 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:43.236239 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9d664" podUID="b183c600-7bbc-4275-b1b6-1a71e7b6cc15"
Apr 21 07:51:43.342020 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:43.341975 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2phll" event={"ID":"a8a61d55-c981-4fda-bb59-0fc4d138d739","Type":"ContainerStarted","Data":"035dfb7b8089f3c37b2f9b90fb3269889a6303b67d8e91b04a2b45bcfe0ddb19"}
Apr 21 07:51:43.342020 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:43.342003 2567 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 21 07:51:44.235681 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:44.235639 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9cpch"
Apr 21 07:51:44.235681 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:44.235678 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mbkk9"
Apr 21 07:51:44.235919 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:44.235760 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9cpch" podUID="a349eff5-16ab-4567-b234-b08f49e937a1"
Apr 21 07:51:44.235919 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:44.235885 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mbkk9" podUID="292daedb-8f6d-4fbe-b50d-eff99dbdb227"
Apr 21 07:51:44.716170 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:44.716000 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a349eff5-16ab-4567-b234-b08f49e937a1-original-pull-secret\") pod \"global-pull-secret-syncer-9cpch\" (UID: \"a349eff5-16ab-4567-b234-b08f49e937a1\") " pod="kube-system/global-pull-secret-syncer-9cpch"
Apr 21 07:51:44.717018 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:44.716144 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 07:51:44.717018 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:44.716276 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a349eff5-16ab-4567-b234-b08f49e937a1-original-pull-secret podName:a349eff5-16ab-4567-b234-b08f49e937a1 nodeName:}" failed. No retries permitted until 2026-04-21 07:51:48.716262165 +0000 UTC m=+30.114509263 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a349eff5-16ab-4567-b234-b08f49e937a1-original-pull-secret") pod "global-pull-secret-syncer-9cpch" (UID: "a349eff5-16ab-4567-b234-b08f49e937a1") : object "kube-system"/"original-pull-secret" not registered
Apr 21 07:51:45.236136 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:45.236103 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9d664"
Apr 21 07:51:45.236308 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:45.236220 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9d664" podUID="b183c600-7bbc-4275-b1b6-1a71e7b6cc15"
Apr 21 07:51:45.348175 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:45.348138 2567 generic.go:358] "Generic (PLEG): container finished" podID="7eac19c8-be6a-49df-a01f-690587797f2d" containerID="de2970a2ab5fd7dcf30dbbb1ba7ba835de83777ec77a11c56b45b37f05498ecd" exitCode=0
Apr 21 07:51:45.348335 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:45.348224 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z2d8g" event={"ID":"7eac19c8-be6a-49df-a01f-690587797f2d","Type":"ContainerDied","Data":"de2970a2ab5fd7dcf30dbbb1ba7ba835de83777ec77a11c56b45b37f05498ecd"}
Apr 21 07:51:45.351240 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:45.351216 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2phll" event={"ID":"a8a61d55-c981-4fda-bb59-0fc4d138d739","Type":"ContainerStarted","Data":"491a7f87a9178145d3a82b222f7d9ab8fd8f4f35c6d1858cfd0e86225f59b10f"}
Apr 21 07:51:45.351504 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:45.351485 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2phll"
Apr 21 07:51:45.351598 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:45.351509 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2phll"
Apr 21 07:51:45.365588 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:45.365569 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2phll"
Apr 21 07:51:45.394982 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:45.394947 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-2phll" podStartSLOduration=8.936503844 podStartE2EDuration="26.394936211s" podCreationTimestamp="2026-04-21 07:51:19 +0000 UTC" firstStartedPulling="2026-04-21 07:51:22.188043263 +0000 UTC m=+3.586290366" lastFinishedPulling="2026-04-21 07:51:39.646475632 +0000 UTC m=+21.044722733" observedRunningTime="2026-04-21 07:51:45.393630119 +0000 UTC m=+26.791877239" watchObservedRunningTime="2026-04-21 07:51:45.394936211 +0000 UTC m=+26.793183330"
Apr 21 07:51:46.236317 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:46.236188 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mbkk9"
Apr 21 07:51:46.236317 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:46.236197 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9cpch"
Apr 21 07:51:46.236778 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:46.236321 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mbkk9" podUID="292daedb-8f6d-4fbe-b50d-eff99dbdb227"
Apr 21 07:51:46.236778 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:46.236387 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9cpch" podUID="a349eff5-16ab-4567-b234-b08f49e937a1"
Apr 21 07:51:46.355134 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:46.355055 2567 generic.go:358] "Generic (PLEG): container finished" podID="7eac19c8-be6a-49df-a01f-690587797f2d" containerID="ed8f7e610972235319dd50b6a12b2e49a2072b6a58bcc5500f2e663a034177b4" exitCode=0
Apr 21 07:51:46.355283 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:46.355136 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z2d8g" event={"ID":"7eac19c8-be6a-49df-a01f-690587797f2d","Type":"ContainerDied","Data":"ed8f7e610972235319dd50b6a12b2e49a2072b6a58bcc5500f2e663a034177b4"}
Apr 21 07:51:46.355762 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:46.355679 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2phll"
Apr 21 07:51:46.369995 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:46.369975 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2phll"
Apr 21 07:51:46.500376 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:46.500349 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9cpch"]
Apr 21 07:51:46.500525 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:46.500433 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9cpch"
Apr 21 07:51:46.500525 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:46.500510 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9cpch" podUID="a349eff5-16ab-4567-b234-b08f49e937a1"
Apr 21 07:51:46.503084 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:46.503057 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9d664"]
Apr 21 07:51:46.503210 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:46.503168 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9d664"
Apr 21 07:51:46.503292 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:46.503269 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9d664" podUID="b183c600-7bbc-4275-b1b6-1a71e7b6cc15"
Apr 21 07:51:46.503700 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:46.503682 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mbkk9"]
Apr 21 07:51:46.503776 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:46.503760 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mbkk9"
Apr 21 07:51:46.503902 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:46.503855 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mbkk9" podUID="292daedb-8f6d-4fbe-b50d-eff99dbdb227"
Apr 21 07:51:47.358595 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:47.358562 2567 generic.go:358] "Generic (PLEG): container finished" podID="7eac19c8-be6a-49df-a01f-690587797f2d" containerID="2b9fc83ddf4e31747223be97f93d2514e3d08f7070634cbad8b5fdb3a664a49f" exitCode=0
Apr 21 07:51:47.359014 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:47.358670 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z2d8g" event={"ID":"7eac19c8-be6a-49df-a01f-690587797f2d","Type":"ContainerDied","Data":"2b9fc83ddf4e31747223be97f93d2514e3d08f7070634cbad8b5fdb3a664a49f"}
Apr 21 07:51:48.236201 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:48.236162 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9d664"
Apr 21 07:51:48.236363 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:48.236269 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9cpch"
Apr 21 07:51:48.236363 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:48.236291 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mbkk9"
Apr 21 07:51:48.236443 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:48.236368 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9cpch" podUID="a349eff5-16ab-4567-b234-b08f49e937a1"
Apr 21 07:51:48.236443 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:48.236268 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9d664" podUID="b183c600-7bbc-4275-b1b6-1a71e7b6cc15"
Apr 21 07:51:48.236543 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:48.236460 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mbkk9" podUID="292daedb-8f6d-4fbe-b50d-eff99dbdb227"
Apr 21 07:51:48.747553 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:48.747514 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a349eff5-16ab-4567-b234-b08f49e937a1-original-pull-secret\") pod \"global-pull-secret-syncer-9cpch\" (UID: \"a349eff5-16ab-4567-b234-b08f49e937a1\") " pod="kube-system/global-pull-secret-syncer-9cpch"
Apr 21 07:51:48.748157 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:48.747647 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 07:51:48.748157 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:48.747738 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a349eff5-16ab-4567-b234-b08f49e937a1-original-pull-secret podName:a349eff5-16ab-4567-b234-b08f49e937a1 nodeName:}" failed. No retries permitted until 2026-04-21 07:51:56.747718497 +0000 UTC m=+38.145965605 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a349eff5-16ab-4567-b234-b08f49e937a1-original-pull-secret") pod "global-pull-secret-syncer-9cpch" (UID: "a349eff5-16ab-4567-b234-b08f49e937a1") : object "kube-system"/"original-pull-secret" not registered
Apr 21 07:51:49.874113 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:49.873885 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-8fs97"
Apr 21 07:51:49.874572 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:49.874241 2567 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 21 07:51:49.874714 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:49.874678 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-8fs97"
Apr 21 07:51:50.235332 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:50.235297 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9cpch"
Apr 21 07:51:50.235502 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:50.235303 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mbkk9"
Apr 21 07:51:50.235502 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:50.235459 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9cpch" podUID="a349eff5-16ab-4567-b234-b08f49e937a1"
Apr 21 07:51:50.235502 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:50.235314 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9d664"
Apr 21 07:51:50.235675 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:50.235510 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mbkk9" podUID="292daedb-8f6d-4fbe-b50d-eff99dbdb227"
Apr 21 07:51:50.235675 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:50.235574 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9d664" podUID="b183c600-7bbc-4275-b1b6-1a71e7b6cc15"
Apr 21 07:51:52.236330 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.236269 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9cpch"
Apr 21 07:51:52.237064 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.236361 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mbkk9"
Apr 21 07:51:52.237064 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.236380 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9d664"
Apr 21 07:51:52.237064 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:52.236492 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9cpch" podUID="a349eff5-16ab-4567-b234-b08f49e937a1"
Apr 21 07:51:52.237064 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:52.236588 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mbkk9" podUID="292daedb-8f6d-4fbe-b50d-eff99dbdb227"
Apr 21 07:51:52.237064 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:52.236700 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-9d664" podUID="b183c600-7bbc-4275-b1b6-1a71e7b6cc15" Apr 21 07:51:52.351000 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.350931 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-176.ec2.internal" event="NodeReady" Apr 21 07:51:52.351146 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.351079 2567 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 07:51:52.384088 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.384052 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56dfc4bb8c-pxf5v"] Apr 21 07:51:52.419722 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.419686 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6466bc5fb5-c9dfh"] Apr 21 07:51:52.419935 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.419839 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56dfc4bb8c-pxf5v" Apr 21 07:51:52.423097 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.422426 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 21 07:51:52.423097 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.422707 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 21 07:51:52.423097 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.422971 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 21 07:51:52.424173 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.424134 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 21 07:51:52.424295 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.424211 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-9cchs\"" Apr 21 07:51:52.434765 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.434721 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-798bf55c97-2rlct"] Apr 21 07:51:52.435070 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.435034 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh" Apr 21 07:51:52.437122 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.437104 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 21 07:51:52.437289 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.437271 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 21 07:51:52.437349 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.437312 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-cp6pq\"" Apr 21 07:51:52.437401 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.437276 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 21 07:51:52.442139 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.442118 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 21 07:51:52.449988 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.449944 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-947548f64-g4qpp"] Apr 21 07:51:52.450307 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.450052 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-798bf55c97-2rlct" Apr 21 07:51:52.452341 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.452321 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 21 07:51:52.473537 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.473514 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56dfc4bb8c-pxf5v"] Apr 21 07:51:52.473684 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.473546 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-798bf55c97-2rlct"] Apr 21 07:51:52.473684 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.473559 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-947548f64-g4qpp"] Apr 21 07:51:52.473684 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.473568 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6466bc5fb5-c9dfh"] Apr 21 07:51:52.473684 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.473579 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-5krbh"] Apr 21 07:51:52.473684 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.473618 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-947548f64-g4qpp" Apr 21 07:51:52.476156 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.476136 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 21 07:51:52.476282 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.476268 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 21 07:51:52.476360 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.476314 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 21 07:51:52.476360 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.476345 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 21 07:51:52.485467 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.485449 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5krbh"] Apr 21 07:51:52.485565 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.485547 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5krbh" Apr 21 07:51:52.488083 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.488034 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 07:51:52.488083 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.488058 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 07:51:52.488246 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.488098 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-pcpsl\"" Apr 21 07:51:52.488246 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.488123 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 07:51:52.500348 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.500330 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-gt2km"] Apr 21 07:51:52.519161 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.519140 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gt2km"] Apr 21 07:51:52.519286 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.519268 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-gt2km" Apr 21 07:51:52.521859 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.521718 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 07:51:52.521859 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.521785 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 07:51:52.522020 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.521995 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-4xjgf\"" Apr 21 07:51:52.578119 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.578082 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl2tc\" (UniqueName: \"kubernetes.io/projected/2b660957-c93c-4199-8ce0-edaea697da94-kube-api-access-xl2tc\") pod \"cluster-proxy-proxy-agent-947548f64-g4qpp\" (UID: \"2b660957-c93c-4199-8ce0-edaea697da94\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-947548f64-g4qpp" Apr 21 07:51:52.578288 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.578126 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/2b660957-c93c-4199-8ce0-edaea697da94-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-947548f64-g4qpp\" (UID: \"2b660957-c93c-4199-8ce0-edaea697da94\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-947548f64-g4qpp" Apr 21 07:51:52.578288 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.578147 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8-cert\") pod \"ingress-canary-5krbh\" (UID: \"ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8\") " 
pod="openshift-ingress-canary/ingress-canary-5krbh" Apr 21 07:51:52.578288 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.578168 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/408521c6-dc50-4c6a-b423-65779840ad61-ca-trust-extracted\") pod \"image-registry-6466bc5fb5-c9dfh\" (UID: \"408521c6-dc50-4c6a-b423-65779840ad61\") " pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh" Apr 21 07:51:52.578288 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.578186 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/2b660957-c93c-4199-8ce0-edaea697da94-ca\") pod \"cluster-proxy-proxy-agent-947548f64-g4qpp\" (UID: \"2b660957-c93c-4199-8ce0-edaea697da94\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-947548f64-g4qpp" Apr 21 07:51:52.578288 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.578213 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b900c2ec-54d0-46cc-802a-261f6c72720b-tmp\") pod \"klusterlet-addon-workmgr-798bf55c97-2rlct\" (UID: \"b900c2ec-54d0-46cc-802a-261f6c72720b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-798bf55c97-2rlct" Apr 21 07:51:52.578288 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.578231 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj8fl\" (UniqueName: \"kubernetes.io/projected/b900c2ec-54d0-46cc-802a-261f6c72720b-kube-api-access-vj8fl\") pod \"klusterlet-addon-workmgr-798bf55c97-2rlct\" (UID: \"b900c2ec-54d0-46cc-802a-261f6c72720b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-798bf55c97-2rlct" Apr 21 07:51:52.578288 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.578261 2567 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-registry-tls\") pod \"image-registry-6466bc5fb5-c9dfh\" (UID: \"408521c6-dc50-4c6a-b423-65779840ad61\") " pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh" Apr 21 07:51:52.578288 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.578279 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-bound-sa-token\") pod \"image-registry-6466bc5fb5-c9dfh\" (UID: \"408521c6-dc50-4c6a-b423-65779840ad61\") " pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh" Apr 21 07:51:52.578730 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.578303 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng5q8\" (UniqueName: \"kubernetes.io/projected/257661c1-c0fc-4a0f-a1b7-4bef97abe473-kube-api-access-ng5q8\") pod \"managed-serviceaccount-addon-agent-56dfc4bb8c-pxf5v\" (UID: \"257661c1-c0fc-4a0f-a1b7-4bef97abe473\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56dfc4bb8c-pxf5v" Apr 21 07:51:52.578730 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.578327 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx8cf\" (UniqueName: \"kubernetes.io/projected/ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8-kube-api-access-zx8cf\") pod \"ingress-canary-5krbh\" (UID: \"ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8\") " pod="openshift-ingress-canary/ingress-canary-5krbh" Apr 21 07:51:52.578730 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.578374 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/408521c6-dc50-4c6a-b423-65779840ad61-image-registry-private-configuration\") pod \"image-registry-6466bc5fb5-c9dfh\" (UID: \"408521c6-dc50-4c6a-b423-65779840ad61\") " pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh" Apr 21 07:51:52.578730 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.578395 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/2b660957-c93c-4199-8ce0-edaea697da94-hub\") pod \"cluster-proxy-proxy-agent-947548f64-g4qpp\" (UID: \"2b660957-c93c-4199-8ce0-edaea697da94\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-947548f64-g4qpp" Apr 21 07:51:52.578730 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.578469 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/408521c6-dc50-4c6a-b423-65779840ad61-trusted-ca\") pod \"image-registry-6466bc5fb5-c9dfh\" (UID: \"408521c6-dc50-4c6a-b423-65779840ad61\") " pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh" Apr 21 07:51:52.578730 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.578502 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2b660957-c93c-4199-8ce0-edaea697da94-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-947548f64-g4qpp\" (UID: \"2b660957-c93c-4199-8ce0-edaea697da94\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-947548f64-g4qpp" Apr 21 07:51:52.578730 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.578550 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/408521c6-dc50-4c6a-b423-65779840ad61-registry-certificates\") pod \"image-registry-6466bc5fb5-c9dfh\" (UID: 
\"408521c6-dc50-4c6a-b423-65779840ad61\") " pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh" Apr 21 07:51:52.578730 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.578579 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/408521c6-dc50-4c6a-b423-65779840ad61-installation-pull-secrets\") pod \"image-registry-6466bc5fb5-c9dfh\" (UID: \"408521c6-dc50-4c6a-b423-65779840ad61\") " pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh" Apr 21 07:51:52.578730 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.578604 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9tp5\" (UniqueName: \"kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-kube-api-access-v9tp5\") pod \"image-registry-6466bc5fb5-c9dfh\" (UID: \"408521c6-dc50-4c6a-b423-65779840ad61\") " pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh" Apr 21 07:51:52.578730 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.578636 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/b900c2ec-54d0-46cc-802a-261f6c72720b-klusterlet-config\") pod \"klusterlet-addon-workmgr-798bf55c97-2rlct\" (UID: \"b900c2ec-54d0-46cc-802a-261f6c72720b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-798bf55c97-2rlct" Apr 21 07:51:52.578730 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.578672 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/2b660957-c93c-4199-8ce0-edaea697da94-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-947548f64-g4qpp\" (UID: \"2b660957-c93c-4199-8ce0-edaea697da94\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-947548f64-g4qpp" Apr 21 07:51:52.579251 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.578747 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/257661c1-c0fc-4a0f-a1b7-4bef97abe473-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-56dfc4bb8c-pxf5v\" (UID: \"257661c1-c0fc-4a0f-a1b7-4bef97abe473\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56dfc4bb8c-pxf5v" Apr 21 07:51:52.679613 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.679527 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/408521c6-dc50-4c6a-b423-65779840ad61-image-registry-private-configuration\") pod \"image-registry-6466bc5fb5-c9dfh\" (UID: \"408521c6-dc50-4c6a-b423-65779840ad61\") " pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh" Apr 21 07:51:52.679613 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.679584 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/2b660957-c93c-4199-8ce0-edaea697da94-hub\") pod \"cluster-proxy-proxy-agent-947548f64-g4qpp\" (UID: \"2b660957-c93c-4199-8ce0-edaea697da94\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-947548f64-g4qpp" Apr 21 07:51:52.679613 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.679613 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/408521c6-dc50-4c6a-b423-65779840ad61-trusted-ca\") pod \"image-registry-6466bc5fb5-c9dfh\" (UID: \"408521c6-dc50-4c6a-b423-65779840ad61\") " pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh" Apr 21 07:51:52.679932 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.679635 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2b660957-c93c-4199-8ce0-edaea697da94-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-947548f64-g4qpp\" (UID: \"2b660957-c93c-4199-8ce0-edaea697da94\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-947548f64-g4qpp" Apr 21 07:51:52.679932 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.679698 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/408521c6-dc50-4c6a-b423-65779840ad61-registry-certificates\") pod \"image-registry-6466bc5fb5-c9dfh\" (UID: \"408521c6-dc50-4c6a-b423-65779840ad61\") " pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh" Apr 21 07:51:52.679932 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.679722 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/408521c6-dc50-4c6a-b423-65779840ad61-installation-pull-secrets\") pod \"image-registry-6466bc5fb5-c9dfh\" (UID: \"408521c6-dc50-4c6a-b423-65779840ad61\") " pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh" Apr 21 07:51:52.680338 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.680296 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v9tp5\" (UniqueName: \"kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-kube-api-access-v9tp5\") pod \"image-registry-6466bc5fb5-c9dfh\" (UID: \"408521c6-dc50-4c6a-b423-65779840ad61\") " pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh" Apr 21 07:51:52.680439 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.680347 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/b900c2ec-54d0-46cc-802a-261f6c72720b-klusterlet-config\") pod 
\"klusterlet-addon-workmgr-798bf55c97-2rlct\" (UID: \"b900c2ec-54d0-46cc-802a-261f6c72720b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-798bf55c97-2rlct" Apr 21 07:51:52.680439 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.680374 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/408521c6-dc50-4c6a-b423-65779840ad61-registry-certificates\") pod \"image-registry-6466bc5fb5-c9dfh\" (UID: \"408521c6-dc50-4c6a-b423-65779840ad61\") " pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh" Apr 21 07:51:52.680439 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.680379 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/2b660957-c93c-4199-8ce0-edaea697da94-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-947548f64-g4qpp\" (UID: \"2b660957-c93c-4199-8ce0-edaea697da94\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-947548f64-g4qpp" Apr 21 07:51:52.680439 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.680412 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46qq9\" (UniqueName: \"kubernetes.io/projected/051ee8de-c3b9-4235-b367-e1804c2a570a-kube-api-access-46qq9\") pod \"dns-default-gt2km\" (UID: \"051ee8de-c3b9-4235-b367-e1804c2a570a\") " pod="openshift-dns/dns-default-gt2km" Apr 21 07:51:52.680593 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.680448 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/257661c1-c0fc-4a0f-a1b7-4bef97abe473-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-56dfc4bb8c-pxf5v\" (UID: \"257661c1-c0fc-4a0f-a1b7-4bef97abe473\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56dfc4bb8c-pxf5v" Apr 
21 07:51:52.680593 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.680504 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xl2tc\" (UniqueName: \"kubernetes.io/projected/2b660957-c93c-4199-8ce0-edaea697da94-kube-api-access-xl2tc\") pod \"cluster-proxy-proxy-agent-947548f64-g4qpp\" (UID: \"2b660957-c93c-4199-8ce0-edaea697da94\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-947548f64-g4qpp" Apr 21 07:51:52.680593 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.680529 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/051ee8de-c3b9-4235-b367-e1804c2a570a-config-volume\") pod \"dns-default-gt2km\" (UID: \"051ee8de-c3b9-4235-b367-e1804c2a570a\") " pod="openshift-dns/dns-default-gt2km" Apr 21 07:51:52.680593 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.680571 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/2b660957-c93c-4199-8ce0-edaea697da94-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-947548f64-g4qpp\" (UID: \"2b660957-c93c-4199-8ce0-edaea697da94\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-947548f64-g4qpp" Apr 21 07:51:52.680800 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.680597 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8-cert\") pod \"ingress-canary-5krbh\" (UID: \"ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8\") " pod="openshift-ingress-canary/ingress-canary-5krbh" Apr 21 07:51:52.680800 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.680642 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/408521c6-dc50-4c6a-b423-65779840ad61-ca-trust-extracted\") pod \"image-registry-6466bc5fb5-c9dfh\" (UID: \"408521c6-dc50-4c6a-b423-65779840ad61\") " pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh"
Apr 21 07:51:52.685346 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.682297 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/408521c6-dc50-4c6a-b423-65779840ad61-trusted-ca\") pod \"image-registry-6466bc5fb5-c9dfh\" (UID: \"408521c6-dc50-4c6a-b423-65779840ad61\") " pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh"
Apr 21 07:51:52.685346 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.682370 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/2b660957-c93c-4199-8ce0-edaea697da94-ca\") pod \"cluster-proxy-proxy-agent-947548f64-g4qpp\" (UID: \"2b660957-c93c-4199-8ce0-edaea697da94\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-947548f64-g4qpp"
Apr 21 07:51:52.685346 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.682417 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/051ee8de-c3b9-4235-b367-e1804c2a570a-metrics-tls\") pod \"dns-default-gt2km\" (UID: \"051ee8de-c3b9-4235-b367-e1804c2a570a\") " pod="openshift-dns/dns-default-gt2km"
Apr 21 07:51:52.685346 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.682490 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b900c2ec-54d0-46cc-802a-261f6c72720b-tmp\") pod \"klusterlet-addon-workmgr-798bf55c97-2rlct\" (UID: \"b900c2ec-54d0-46cc-802a-261f6c72720b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-798bf55c97-2rlct"
Apr 21 07:51:52.685346 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.682530 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vj8fl\" (UniqueName: \"kubernetes.io/projected/b900c2ec-54d0-46cc-802a-261f6c72720b-kube-api-access-vj8fl\") pod \"klusterlet-addon-workmgr-798bf55c97-2rlct\" (UID: \"b900c2ec-54d0-46cc-802a-261f6c72720b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-798bf55c97-2rlct"
Apr 21 07:51:52.685346 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.682578 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/051ee8de-c3b9-4235-b367-e1804c2a570a-tmp-dir\") pod \"dns-default-gt2km\" (UID: \"051ee8de-c3b9-4235-b367-e1804c2a570a\") " pod="openshift-dns/dns-default-gt2km"
Apr 21 07:51:52.685346 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.682621 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-registry-tls\") pod \"image-registry-6466bc5fb5-c9dfh\" (UID: \"408521c6-dc50-4c6a-b423-65779840ad61\") " pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh"
Apr 21 07:51:52.685346 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.682671 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-bound-sa-token\") pod \"image-registry-6466bc5fb5-c9dfh\" (UID: \"408521c6-dc50-4c6a-b423-65779840ad61\") " pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh"
Apr 21 07:51:52.685346 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.682711 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ng5q8\" (UniqueName: \"kubernetes.io/projected/257661c1-c0fc-4a0f-a1b7-4bef97abe473-kube-api-access-ng5q8\") pod \"managed-serviceaccount-addon-agent-56dfc4bb8c-pxf5v\" (UID: \"257661c1-c0fc-4a0f-a1b7-4bef97abe473\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56dfc4bb8c-pxf5v"
Apr 21 07:51:52.685346 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.682743 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zx8cf\" (UniqueName: \"kubernetes.io/projected/ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8-kube-api-access-zx8cf\") pod \"ingress-canary-5krbh\" (UID: \"ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8\") " pod="openshift-ingress-canary/ingress-canary-5krbh"
Apr 21 07:51:52.685346 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.683451 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b900c2ec-54d0-46cc-802a-261f6c72720b-tmp\") pod \"klusterlet-addon-workmgr-798bf55c97-2rlct\" (UID: \"b900c2ec-54d0-46cc-802a-261f6c72720b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-798bf55c97-2rlct"
Apr 21 07:51:52.685346 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:52.683858 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 07:51:52.685346 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:52.683876 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6466bc5fb5-c9dfh: secret "image-registry-tls" not found
Apr 21 07:51:52.685346 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:52.683949 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-registry-tls podName:408521c6-dc50-4c6a-b423-65779840ad61 nodeName:}" failed. No retries permitted until 2026-04-21 07:51:53.183928926 +0000 UTC m=+34.582176040 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-registry-tls") pod "image-registry-6466bc5fb5-c9dfh" (UID: "408521c6-dc50-4c6a-b423-65779840ad61") : secret "image-registry-tls" not found
Apr 21 07:51:52.685346 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.685133 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/408521c6-dc50-4c6a-b423-65779840ad61-installation-pull-secrets\") pod \"image-registry-6466bc5fb5-c9dfh\" (UID: \"408521c6-dc50-4c6a-b423-65779840ad61\") " pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh"
Apr 21 07:51:52.686227 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.685368 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/2b660957-c93c-4199-8ce0-edaea697da94-hub\") pod \"cluster-proxy-proxy-agent-947548f64-g4qpp\" (UID: \"2b660957-c93c-4199-8ce0-edaea697da94\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-947548f64-g4qpp"
Apr 21 07:51:52.686227 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.685374 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/b900c2ec-54d0-46cc-802a-261f6c72720b-klusterlet-config\") pod \"klusterlet-addon-workmgr-798bf55c97-2rlct\" (UID: \"b900c2ec-54d0-46cc-802a-261f6c72720b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-798bf55c97-2rlct"
Apr 21 07:51:52.686227 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:52.685483 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 07:51:52.686227 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:52.685549 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8-cert podName:ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8 nodeName:}" failed. No retries permitted until 2026-04-21 07:51:53.185531815 +0000 UTC m=+34.583778930 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8-cert") pod "ingress-canary-5krbh" (UID: "ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8") : secret "canary-serving-cert" not found
Apr 21 07:51:52.686227 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.685666 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/408521c6-dc50-4c6a-b423-65779840ad61-ca-trust-extracted\") pod \"image-registry-6466bc5fb5-c9dfh\" (UID: \"408521c6-dc50-4c6a-b423-65779840ad61\") " pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh"
Apr 21 07:51:52.686227 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.685760 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/2b660957-c93c-4199-8ce0-edaea697da94-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-947548f64-g4qpp\" (UID: \"2b660957-c93c-4199-8ce0-edaea697da94\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-947548f64-g4qpp"
Apr 21 07:51:52.688352 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.688210 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/2b660957-c93c-4199-8ce0-edaea697da94-ca\") pod \"cluster-proxy-proxy-agent-947548f64-g4qpp\" (UID: \"2b660957-c93c-4199-8ce0-edaea697da94\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-947548f64-g4qpp"
Apr 21 07:51:52.689350 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.689325 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl2tc\" (UniqueName: \"kubernetes.io/projected/2b660957-c93c-4199-8ce0-edaea697da94-kube-api-access-xl2tc\") pod \"cluster-proxy-proxy-agent-947548f64-g4qpp\" (UID: \"2b660957-c93c-4199-8ce0-edaea697da94\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-947548f64-g4qpp"
Apr 21 07:51:52.690128 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.689783 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/2b660957-c93c-4199-8ce0-edaea697da94-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-947548f64-g4qpp\" (UID: \"2b660957-c93c-4199-8ce0-edaea697da94\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-947548f64-g4qpp"
Apr 21 07:51:52.690241 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.690161 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2b660957-c93c-4199-8ce0-edaea697da94-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-947548f64-g4qpp\" (UID: \"2b660957-c93c-4199-8ce0-edaea697da94\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-947548f64-g4qpp"
Apr 21 07:51:52.690547 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.690519 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx8cf\" (UniqueName: \"kubernetes.io/projected/ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8-kube-api-access-zx8cf\") pod \"ingress-canary-5krbh\" (UID: \"ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8\") " pod="openshift-ingress-canary/ingress-canary-5krbh"
Apr 21 07:51:52.690871 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.690845 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9tp5\" (UniqueName: \"kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-kube-api-access-v9tp5\") pod \"image-registry-6466bc5fb5-c9dfh\" (UID: \"408521c6-dc50-4c6a-b423-65779840ad61\") " pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh"
Apr 21 07:51:52.691205 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.691151 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/257661c1-c0fc-4a0f-a1b7-4bef97abe473-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-56dfc4bb8c-pxf5v\" (UID: \"257661c1-c0fc-4a0f-a1b7-4bef97abe473\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56dfc4bb8c-pxf5v"
Apr 21 07:51:52.692802 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.692777 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-bound-sa-token\") pod \"image-registry-6466bc5fb5-c9dfh\" (UID: \"408521c6-dc50-4c6a-b423-65779840ad61\") " pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh"
Apr 21 07:51:52.692998 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.692980 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng5q8\" (UniqueName: \"kubernetes.io/projected/257661c1-c0fc-4a0f-a1b7-4bef97abe473-kube-api-access-ng5q8\") pod \"managed-serviceaccount-addon-agent-56dfc4bb8c-pxf5v\" (UID: \"257661c1-c0fc-4a0f-a1b7-4bef97abe473\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56dfc4bb8c-pxf5v"
Apr 21 07:51:52.693071 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.693026 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj8fl\" (UniqueName: \"kubernetes.io/projected/b900c2ec-54d0-46cc-802a-261f6c72720b-kube-api-access-vj8fl\") pod \"klusterlet-addon-workmgr-798bf55c97-2rlct\" (UID: \"b900c2ec-54d0-46cc-802a-261f6c72720b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-798bf55c97-2rlct"
Apr 21 07:51:52.696766 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.696748 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/408521c6-dc50-4c6a-b423-65779840ad61-image-registry-private-configuration\") pod \"image-registry-6466bc5fb5-c9dfh\" (UID: \"408521c6-dc50-4c6a-b423-65779840ad61\") " pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh"
Apr 21 07:51:52.743978 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.743937 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56dfc4bb8c-pxf5v"
Apr 21 07:51:52.760811 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.760780 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-798bf55c97-2rlct"
Apr 21 07:51:52.783540 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.783503 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-947548f64-g4qpp"
Apr 21 07:51:52.783733 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.783704 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/051ee8de-c3b9-4235-b367-e1804c2a570a-config-volume\") pod \"dns-default-gt2km\" (UID: \"051ee8de-c3b9-4235-b367-e1804c2a570a\") " pod="openshift-dns/dns-default-gt2km"
Apr 21 07:51:52.783870 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.783771 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/051ee8de-c3b9-4235-b367-e1804c2a570a-metrics-tls\") pod \"dns-default-gt2km\" (UID: \"051ee8de-c3b9-4235-b367-e1804c2a570a\") " pod="openshift-dns/dns-default-gt2km"
Apr 21 07:51:52.783870 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.783821 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/051ee8de-c3b9-4235-b367-e1804c2a570a-tmp-dir\") pod \"dns-default-gt2km\" (UID: \"051ee8de-c3b9-4235-b367-e1804c2a570a\") " pod="openshift-dns/dns-default-gt2km"
Apr 21 07:51:52.783979 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.783879 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-46qq9\" (UniqueName: \"kubernetes.io/projected/051ee8de-c3b9-4235-b367-e1804c2a570a-kube-api-access-46qq9\") pod \"dns-default-gt2km\" (UID: \"051ee8de-c3b9-4235-b367-e1804c2a570a\") " pod="openshift-dns/dns-default-gt2km"
Apr 21 07:51:52.783979 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:52.783885 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 07:51:52.783979 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:52.783951 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/051ee8de-c3b9-4235-b367-e1804c2a570a-metrics-tls podName:051ee8de-c3b9-4235-b367-e1804c2a570a nodeName:}" failed. No retries permitted until 2026-04-21 07:51:53.283930039 +0000 UTC m=+34.682177137 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/051ee8de-c3b9-4235-b367-e1804c2a570a-metrics-tls") pod "dns-default-gt2km" (UID: "051ee8de-c3b9-4235-b367-e1804c2a570a") : secret "dns-default-metrics-tls" not found
Apr 21 07:51:52.784255 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.784164 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/051ee8de-c3b9-4235-b367-e1804c2a570a-tmp-dir\") pod \"dns-default-gt2km\" (UID: \"051ee8de-c3b9-4235-b367-e1804c2a570a\") " pod="openshift-dns/dns-default-gt2km"
Apr 21 07:51:52.784408 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.784385 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/051ee8de-c3b9-4235-b367-e1804c2a570a-config-volume\") pod \"dns-default-gt2km\" (UID: \"051ee8de-c3b9-4235-b367-e1804c2a570a\") " pod="openshift-dns/dns-default-gt2km"
Apr 21 07:51:52.792596 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.792576 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-46qq9\" (UniqueName: \"kubernetes.io/projected/051ee8de-c3b9-4235-b367-e1804c2a570a-kube-api-access-46qq9\") pod \"dns-default-gt2km\" (UID: \"051ee8de-c3b9-4235-b367-e1804c2a570a\") " pod="openshift-dns/dns-default-gt2km"
Apr 21 07:51:52.884261 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:52.884227 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/292daedb-8f6d-4fbe-b50d-eff99dbdb227-metrics-certs\") pod \"network-metrics-daemon-mbkk9\" (UID: \"292daedb-8f6d-4fbe-b50d-eff99dbdb227\") " pod="openshift-multus/network-metrics-daemon-mbkk9"
Apr 21 07:51:52.884561 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:52.884462 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:51:52.884561 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:52.884535 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/292daedb-8f6d-4fbe-b50d-eff99dbdb227-metrics-certs podName:292daedb-8f6d-4fbe-b50d-eff99dbdb227 nodeName:}" failed. No retries permitted until 2026-04-21 07:52:24.884516622 +0000 UTC m=+66.282763722 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/292daedb-8f6d-4fbe-b50d-eff99dbdb227-metrics-certs") pod "network-metrics-daemon-mbkk9" (UID: "292daedb-8f6d-4fbe-b50d-eff99dbdb227") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:51:53.010898 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:53.010694 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56dfc4bb8c-pxf5v"]
Apr 21 07:51:53.011510 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:53.011490 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-947548f64-g4qpp"]
Apr 21 07:51:53.014981 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:53.014954 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-798bf55c97-2rlct"]
Apr 21 07:51:53.058787 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:53.058752 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod257661c1_c0fc_4a0f_a1b7_4bef97abe473.slice/crio-0751236bbc3997b2732c0a58d0ca9bd9f667e3aefbd4c572ee2cbb6fec6529b7 WatchSource:0}: Error finding container 0751236bbc3997b2732c0a58d0ca9bd9f667e3aefbd4c572ee2cbb6fec6529b7: Status 404 returned error can't find the container with id 0751236bbc3997b2732c0a58d0ca9bd9f667e3aefbd4c572ee2cbb6fec6529b7
Apr 21 07:51:53.058988 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:53.058966 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b660957_c93c_4199_8ce0_edaea697da94.slice/crio-52b42106299eeb0da15e826121f6569f9118317c9e3a51cb39f521e4ee1a27ba WatchSource:0}: Error finding container 52b42106299eeb0da15e826121f6569f9118317c9e3a51cb39f521e4ee1a27ba: Status 404 returned error can't find the container with id 52b42106299eeb0da15e826121f6569f9118317c9e3a51cb39f521e4ee1a27ba
Apr 21 07:51:53.066383 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:53.066354 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb900c2ec_54d0_46cc_802a_261f6c72720b.slice/crio-7fbbc52ca378310db53cc3aedf85c5dfc11e1fd83e702664a5e98425c930b74a WatchSource:0}: Error finding container 7fbbc52ca378310db53cc3aedf85c5dfc11e1fd83e702664a5e98425c930b74a: Status 404 returned error can't find the container with id 7fbbc52ca378310db53cc3aedf85c5dfc11e1fd83e702664a5e98425c930b74a
Apr 21 07:51:53.086399 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:53.086379 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ps2z\" (UniqueName: \"kubernetes.io/projected/b183c600-7bbc-4275-b1b6-1a71e7b6cc15-kube-api-access-5ps2z\") pod \"network-check-target-9d664\" (UID: \"b183c600-7bbc-4275-b1b6-1a71e7b6cc15\") " pod="openshift-network-diagnostics/network-check-target-9d664"
Apr 21 07:51:53.086518 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:53.086504 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 07:51:53.086597 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:53.086522 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 07:51:53.086597 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:53.086541 2567 projected.go:194] Error preparing data for projected volume kube-api-access-5ps2z for pod openshift-network-diagnostics/network-check-target-9d664: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 07:51:53.086597 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:53.086588 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b183c600-7bbc-4275-b1b6-1a71e7b6cc15-kube-api-access-5ps2z podName:b183c600-7bbc-4275-b1b6-1a71e7b6cc15 nodeName:}" failed. No retries permitted until 2026-04-21 07:52:25.086573931 +0000 UTC m=+66.484821030 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-5ps2z" (UniqueName: "kubernetes.io/projected/b183c600-7bbc-4275-b1b6-1a71e7b6cc15-kube-api-access-5ps2z") pod "network-check-target-9d664" (UID: "b183c600-7bbc-4275-b1b6-1a71e7b6cc15") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 07:51:53.187585 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:53.187528 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8-cert\") pod \"ingress-canary-5krbh\" (UID: \"ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8\") " pod="openshift-ingress-canary/ingress-canary-5krbh"
Apr 21 07:51:53.187781 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:53.187613 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-registry-tls\") pod \"image-registry-6466bc5fb5-c9dfh\" (UID: \"408521c6-dc50-4c6a-b423-65779840ad61\") " pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh"
Apr 21 07:51:53.187781 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:53.187751 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 07:51:53.187887 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:53.187832 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8-cert podName:ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8 nodeName:}" failed. No retries permitted until 2026-04-21 07:51:54.187812062 +0000 UTC m=+35.586059385 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8-cert") pod "ingress-canary-5krbh" (UID: "ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8") : secret "canary-serving-cert" not found
Apr 21 07:51:53.187887 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:53.187753 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 07:51:53.187887 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:53.187861 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6466bc5fb5-c9dfh: secret "image-registry-tls" not found
Apr 21 07:51:53.188089 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:53.187924 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-registry-tls podName:408521c6-dc50-4c6a-b423-65779840ad61 nodeName:}" failed. No retries permitted until 2026-04-21 07:51:54.187897709 +0000 UTC m=+35.586144817 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-registry-tls") pod "image-registry-6466bc5fb5-c9dfh" (UID: "408521c6-dc50-4c6a-b423-65779840ad61") : secret "image-registry-tls" not found
Apr 21 07:51:53.288246 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:53.288219 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/051ee8de-c3b9-4235-b367-e1804c2a570a-metrics-tls\") pod \"dns-default-gt2km\" (UID: \"051ee8de-c3b9-4235-b367-e1804c2a570a\") " pod="openshift-dns/dns-default-gt2km"
Apr 21 07:51:53.288723 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:53.288370 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 07:51:53.288723 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:53.288438 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/051ee8de-c3b9-4235-b367-e1804c2a570a-metrics-tls podName:051ee8de-c3b9-4235-b367-e1804c2a570a nodeName:}" failed. No retries permitted until 2026-04-21 07:51:54.288421372 +0000 UTC m=+35.686668470 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/051ee8de-c3b9-4235-b367-e1804c2a570a-metrics-tls") pod "dns-default-gt2km" (UID: "051ee8de-c3b9-4235-b367-e1804c2a570a") : secret "dns-default-metrics-tls" not found
Apr 21 07:51:53.372839 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:53.372802 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56dfc4bb8c-pxf5v" event={"ID":"257661c1-c0fc-4a0f-a1b7-4bef97abe473","Type":"ContainerStarted","Data":"0751236bbc3997b2732c0a58d0ca9bd9f667e3aefbd4c572ee2cbb6fec6529b7"}
Apr 21 07:51:53.375398 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:53.375369 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z2d8g" event={"ID":"7eac19c8-be6a-49df-a01f-690587797f2d","Type":"ContainerStarted","Data":"9cc9109e5555b8f0c83ebfb1c0321642c8788cf4cdb5944542aa35b822558151"}
Apr 21 07:51:53.376469 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:53.376445 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-947548f64-g4qpp" event={"ID":"2b660957-c93c-4199-8ce0-edaea697da94","Type":"ContainerStarted","Data":"52b42106299eeb0da15e826121f6569f9118317c9e3a51cb39f521e4ee1a27ba"}
Apr 21 07:51:53.377447 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:53.377420 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-798bf55c97-2rlct" event={"ID":"b900c2ec-54d0-46cc-802a-261f6c72720b","Type":"ContainerStarted","Data":"7fbbc52ca378310db53cc3aedf85c5dfc11e1fd83e702664a5e98425c930b74a"}
Apr 21 07:51:54.197110 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:54.196182 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8-cert\") pod \"ingress-canary-5krbh\" (UID: \"ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8\") " pod="openshift-ingress-canary/ingress-canary-5krbh"
Apr 21 07:51:54.197110 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:54.196335 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-registry-tls\") pod \"image-registry-6466bc5fb5-c9dfh\" (UID: \"408521c6-dc50-4c6a-b423-65779840ad61\") " pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh"
Apr 21 07:51:54.197110 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:54.196525 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 07:51:54.197110 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:54.196542 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6466bc5fb5-c9dfh: secret "image-registry-tls" not found
Apr 21 07:51:54.197110 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:54.196602 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-registry-tls podName:408521c6-dc50-4c6a-b423-65779840ad61 nodeName:}" failed. No retries permitted until 2026-04-21 07:51:56.196580287 +0000 UTC m=+37.594827394 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-registry-tls") pod "image-registry-6466bc5fb5-c9dfh" (UID: "408521c6-dc50-4c6a-b423-65779840ad61") : secret "image-registry-tls" not found
Apr 21 07:51:54.197110 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:54.197020 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 07:51:54.197110 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:54.197072 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8-cert podName:ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8 nodeName:}" failed. No retries permitted until 2026-04-21 07:51:56.197056751 +0000 UTC m=+37.595303852 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8-cert") pod "ingress-canary-5krbh" (UID: "ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8") : secret "canary-serving-cert" not found
Apr 21 07:51:54.236692 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:54.235786 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9cpch"
Apr 21 07:51:54.236692 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:54.235815 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mbkk9"
Apr 21 07:51:54.236692 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:54.235815 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9d664"
Apr 21 07:51:54.239135 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:54.239097 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 21 07:51:54.241447 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:54.239963 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9jd8h\""
Apr 21 07:51:54.241447 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:54.240194 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 21 07:51:54.241447 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:54.240412 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-f27q5\""
Apr 21 07:51:54.241447 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:54.240700 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 21 07:51:54.241447 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:54.240934 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 21 07:51:54.297367 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:54.297332 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/051ee8de-c3b9-4235-b367-e1804c2a570a-metrics-tls\") pod \"dns-default-gt2km\" (UID: \"051ee8de-c3b9-4235-b367-e1804c2a570a\") " pod="openshift-dns/dns-default-gt2km"
Apr 21 07:51:54.297902 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:54.297514 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 07:51:54.297902 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:54.297588 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/051ee8de-c3b9-4235-b367-e1804c2a570a-metrics-tls podName:051ee8de-c3b9-4235-b367-e1804c2a570a nodeName:}" failed. No retries permitted until 2026-04-21 07:51:56.297567679 +0000 UTC m=+37.695814779 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/051ee8de-c3b9-4235-b367-e1804c2a570a-metrics-tls") pod "dns-default-gt2km" (UID: "051ee8de-c3b9-4235-b367-e1804c2a570a") : secret "dns-default-metrics-tls" not found
Apr 21 07:51:54.388129 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:54.387789 2567 generic.go:358] "Generic (PLEG): container finished" podID="7eac19c8-be6a-49df-a01f-690587797f2d" containerID="9cc9109e5555b8f0c83ebfb1c0321642c8788cf4cdb5944542aa35b822558151" exitCode=0
Apr 21 07:51:54.388129 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:54.387875 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z2d8g" event={"ID":"7eac19c8-be6a-49df-a01f-690587797f2d","Type":"ContainerDied","Data":"9cc9109e5555b8f0c83ebfb1c0321642c8788cf4cdb5944542aa35b822558151"}
Apr 21 07:51:55.394106 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:55.394060 2567 generic.go:358] "Generic (PLEG): container finished" podID="7eac19c8-be6a-49df-a01f-690587797f2d" containerID="0e9b29b1bbee98c7760dd0895fd3e5749e96e628e2df20dd8dfe2067a4c2672c" exitCode=0
Apr 21 07:51:55.394620 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:55.394136 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z2d8g" event={"ID":"7eac19c8-be6a-49df-a01f-690587797f2d","Type":"ContainerDied","Data":"0e9b29b1bbee98c7760dd0895fd3e5749e96e628e2df20dd8dfe2067a4c2672c"}
Apr 21 07:51:56.216089 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:56.216046 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8-cert\") pod \"ingress-canary-5krbh\" (UID: \"ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8\") " pod="openshift-ingress-canary/ingress-canary-5krbh"
Apr 21 07:51:56.216271 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:56.216139 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-registry-tls\") pod \"image-registry-6466bc5fb5-c9dfh\" (UID: \"408521c6-dc50-4c6a-b423-65779840ad61\") " pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh"
Apr 21 07:51:56.216271 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:56.216220 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 07:51:56.216271 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:56.216256 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 07:51:56.216271 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:56.216272 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6466bc5fb5-c9dfh: secret "image-registry-tls" not found
Apr 21 07:51:56.216422 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:56.216289 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8-cert podName:ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8 nodeName:}" failed. No retries permitted until 2026-04-21 07:52:00.216273708 +0000 UTC m=+41.614520809 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8-cert") pod "ingress-canary-5krbh" (UID: "ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8") : secret "canary-serving-cert" not found
Apr 21 07:51:56.216422 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:56.216321 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-registry-tls podName:408521c6-dc50-4c6a-b423-65779840ad61 nodeName:}" failed. No retries permitted until 2026-04-21 07:52:00.216305131 +0000 UTC m=+41.614552234 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-registry-tls") pod "image-registry-6466bc5fb5-c9dfh" (UID: "408521c6-dc50-4c6a-b423-65779840ad61") : secret "image-registry-tls" not found
Apr 21 07:51:56.317163 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:56.317077 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/051ee8de-c3b9-4235-b367-e1804c2a570a-metrics-tls\") pod \"dns-default-gt2km\" (UID: \"051ee8de-c3b9-4235-b367-e1804c2a570a\") " pod="openshift-dns/dns-default-gt2km"
Apr 21 07:51:56.317321 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:56.317246 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 07:51:56.317376 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:51:56.317325 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/051ee8de-c3b9-4235-b367-e1804c2a570a-metrics-tls podName:051ee8de-c3b9-4235-b367-e1804c2a570a nodeName:}" failed. No retries permitted until 2026-04-21 07:52:00.317303334 +0000 UTC m=+41.715550438 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/051ee8de-c3b9-4235-b367-e1804c2a570a-metrics-tls") pod "dns-default-gt2km" (UID: "051ee8de-c3b9-4235-b367-e1804c2a570a") : secret "dns-default-metrics-tls" not found Apr 21 07:51:56.820182 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:56.820124 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a349eff5-16ab-4567-b234-b08f49e937a1-original-pull-secret\") pod \"global-pull-secret-syncer-9cpch\" (UID: \"a349eff5-16ab-4567-b234-b08f49e937a1\") " pod="kube-system/global-pull-secret-syncer-9cpch" Apr 21 07:51:56.824073 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:56.824043 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a349eff5-16ab-4567-b234-b08f49e937a1-original-pull-secret\") pod \"global-pull-secret-syncer-9cpch\" (UID: \"a349eff5-16ab-4567-b234-b08f49e937a1\") " pod="kube-system/global-pull-secret-syncer-9cpch" Apr 21 07:51:56.956214 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:56.956172 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-9cpch" Apr 21 07:51:59.336059 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:59.336000 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9cpch"] Apr 21 07:51:59.339718 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:51:59.339684 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda349eff5_16ab_4567_b234_b08f49e937a1.slice/crio-eacc501f6a7df2193d6e9ff293dc89b6d3aa75e68539a5c501e7535c9dbb4036 WatchSource:0}: Error finding container eacc501f6a7df2193d6e9ff293dc89b6d3aa75e68539a5c501e7535c9dbb4036: Status 404 returned error can't find the container with id eacc501f6a7df2193d6e9ff293dc89b6d3aa75e68539a5c501e7535c9dbb4036 Apr 21 07:51:59.405815 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:59.405781 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z2d8g" event={"ID":"7eac19c8-be6a-49df-a01f-690587797f2d","Type":"ContainerStarted","Data":"01dbc769586fd033421911938c960618264e5556edd8af31a5811ffcd89dc61f"} Apr 21 07:51:59.407252 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:59.407224 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-798bf55c97-2rlct" event={"ID":"b900c2ec-54d0-46cc-802a-261f6c72720b","Type":"ContainerStarted","Data":"b90e040323e14033dfa0c8dd4a6f4183fd603a1c21b4f4742a6ab8104d39fbb8"} Apr 21 07:51:59.407504 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:59.407485 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-798bf55c97-2rlct" Apr 21 07:51:59.408557 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:59.408533 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56dfc4bb8c-pxf5v" event={"ID":"257661c1-c0fc-4a0f-a1b7-4bef97abe473","Type":"ContainerStarted","Data":"5fe19c57521cc93a042266ee8a69f356b14aee2f6ca0ceea4d1035207c56ceb7"} Apr 21 07:51:59.409058 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:59.409042 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-798bf55c97-2rlct" Apr 21 07:51:59.409811 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:59.409785 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9cpch" event={"ID":"a349eff5-16ab-4567-b234-b08f49e937a1","Type":"ContainerStarted","Data":"eacc501f6a7df2193d6e9ff293dc89b6d3aa75e68539a5c501e7535c9dbb4036"} Apr 21 07:51:59.411685 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:59.411634 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-947548f64-g4qpp" event={"ID":"2b660957-c93c-4199-8ce0-edaea697da94","Type":"ContainerStarted","Data":"6c5a2220ec8174b5b2f7c20739c73059c1a9429f7c97b5cc3354dbede6809bac"} Apr 21 07:51:59.426172 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:59.426102 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-z2d8g" podStartSLOduration=9.511598503 podStartE2EDuration="40.426089709s" podCreationTimestamp="2026-04-21 07:51:19 +0000 UTC" firstStartedPulling="2026-04-21 07:51:22.192366995 +0000 UTC m=+3.590614093" lastFinishedPulling="2026-04-21 07:51:53.106858201 +0000 UTC m=+34.505105299" observedRunningTime="2026-04-21 07:51:59.425076132 +0000 UTC m=+40.823323254" watchObservedRunningTime="2026-04-21 07:51:59.426089709 +0000 UTC m=+40.824336826" Apr 21 07:51:59.444298 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:59.444259 2567 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-798bf55c97-2rlct" podStartSLOduration=22.320267278 podStartE2EDuration="28.444249593s" podCreationTimestamp="2026-04-21 07:51:31 +0000 UTC" firstStartedPulling="2026-04-21 07:51:53.083750312 +0000 UTC m=+34.481997410" lastFinishedPulling="2026-04-21 07:51:59.207732608 +0000 UTC m=+40.605979725" observedRunningTime="2026-04-21 07:51:59.443513794 +0000 UTC m=+40.841760917" watchObservedRunningTime="2026-04-21 07:51:59.444249593 +0000 UTC m=+40.842496733" Apr 21 07:51:59.456585 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:51:59.456551 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56dfc4bb8c-pxf5v" podStartSLOduration=22.323730514 podStartE2EDuration="28.456541596s" podCreationTimestamp="2026-04-21 07:51:31 +0000 UTC" firstStartedPulling="2026-04-21 07:51:53.060394007 +0000 UTC m=+34.458641105" lastFinishedPulling="2026-04-21 07:51:59.193205086 +0000 UTC m=+40.591452187" observedRunningTime="2026-04-21 07:51:59.45644939 +0000 UTC m=+40.854696523" watchObservedRunningTime="2026-04-21 07:51:59.456541596 +0000 UTC m=+40.854788716" Apr 21 07:52:00.249284 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:52:00.249251 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8-cert\") pod \"ingress-canary-5krbh\" (UID: \"ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8\") " pod="openshift-ingress-canary/ingress-canary-5krbh" Apr 21 07:52:00.249475 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:52:00.249324 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-registry-tls\") pod \"image-registry-6466bc5fb5-c9dfh\" (UID: \"408521c6-dc50-4c6a-b423-65779840ad61\") " 
pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh" Apr 21 07:52:00.249475 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:52:00.249414 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 07:52:00.249583 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:52:00.249483 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 07:52:00.249583 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:52:00.249504 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6466bc5fb5-c9dfh: secret "image-registry-tls" not found Apr 21 07:52:00.249583 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:52:00.249490 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8-cert podName:ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8 nodeName:}" failed. No retries permitted until 2026-04-21 07:52:08.24946979 +0000 UTC m=+49.647716888 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8-cert") pod "ingress-canary-5krbh" (UID: "ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8") : secret "canary-serving-cert" not found Apr 21 07:52:00.249583 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:52:00.249566 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-registry-tls podName:408521c6-dc50-4c6a-b423-65779840ad61 nodeName:}" failed. No retries permitted until 2026-04-21 07:52:08.249553856 +0000 UTC m=+49.647800959 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-registry-tls") pod "image-registry-6466bc5fb5-c9dfh" (UID: "408521c6-dc50-4c6a-b423-65779840ad61") : secret "image-registry-tls" not found Apr 21 07:52:00.349748 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:52:00.349708 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/051ee8de-c3b9-4235-b367-e1804c2a570a-metrics-tls\") pod \"dns-default-gt2km\" (UID: \"051ee8de-c3b9-4235-b367-e1804c2a570a\") " pod="openshift-dns/dns-default-gt2km" Apr 21 07:52:00.350170 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:52:00.349892 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 07:52:00.350170 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:52:00.349954 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/051ee8de-c3b9-4235-b367-e1804c2a570a-metrics-tls podName:051ee8de-c3b9-4235-b367-e1804c2a570a nodeName:}" failed. No retries permitted until 2026-04-21 07:52:08.349936355 +0000 UTC m=+49.748183458 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/051ee8de-c3b9-4235-b367-e1804c2a570a-metrics-tls") pod "dns-default-gt2km" (UID: "051ee8de-c3b9-4235-b367-e1804c2a570a") : secret "dns-default-metrics-tls" not found Apr 21 07:52:02.419590 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:52:02.419496 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-947548f64-g4qpp" event={"ID":"2b660957-c93c-4199-8ce0-edaea697da94","Type":"ContainerStarted","Data":"7f40cee3cd911a9b13774df2634fd55b0cc5469d1e96f518ca516197443290d1"} Apr 21 07:52:02.419590 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:52:02.419546 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-947548f64-g4qpp" event={"ID":"2b660957-c93c-4199-8ce0-edaea697da94","Type":"ContainerStarted","Data":"65eafaa46fe627a1b06913df0becae0619aca732cb0029d84f7e577485304153"} Apr 21 07:52:02.437097 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:52:02.437050 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-947548f64-g4qpp" podStartSLOduration=22.367188132 podStartE2EDuration="31.437036544s" podCreationTimestamp="2026-04-21 07:51:31 +0000 UTC" firstStartedPulling="2026-04-21 07:51:53.060433867 +0000 UTC m=+34.458680972" lastFinishedPulling="2026-04-21 07:52:02.130282281 +0000 UTC m=+43.528529384" observedRunningTime="2026-04-21 07:52:02.436784102 +0000 UTC m=+43.835031222" watchObservedRunningTime="2026-04-21 07:52:02.437036544 +0000 UTC m=+43.835283664" Apr 21 07:52:05.429821 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:52:05.429779 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9cpch" 
event={"ID":"a349eff5-16ab-4567-b234-b08f49e937a1","Type":"ContainerStarted","Data":"60f6edc996558399ac22bcd49362a1279aba93760b73e8d30748824071b9d50c"} Apr 21 07:52:05.443990 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:52:05.443943 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-9cpch" podStartSLOduration=20.496874236 podStartE2EDuration="25.44392852s" podCreationTimestamp="2026-04-21 07:51:40 +0000 UTC" firstStartedPulling="2026-04-21 07:51:59.34168189 +0000 UTC m=+40.739928991" lastFinishedPulling="2026-04-21 07:52:04.288736168 +0000 UTC m=+45.686983275" observedRunningTime="2026-04-21 07:52:05.442911332 +0000 UTC m=+46.841158455" watchObservedRunningTime="2026-04-21 07:52:05.44392852 +0000 UTC m=+46.842175639" Apr 21 07:52:08.315179 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:52:08.315144 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8-cert\") pod \"ingress-canary-5krbh\" (UID: \"ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8\") " pod="openshift-ingress-canary/ingress-canary-5krbh" Apr 21 07:52:08.315516 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:52:08.315217 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-registry-tls\") pod \"image-registry-6466bc5fb5-c9dfh\" (UID: \"408521c6-dc50-4c6a-b423-65779840ad61\") " pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh" Apr 21 07:52:08.315516 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:52:08.315300 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 07:52:08.315516 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:52:08.315337 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret 
"image-registry-tls" not found Apr 21 07:52:08.315516 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:52:08.315349 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6466bc5fb5-c9dfh: secret "image-registry-tls" not found Apr 21 07:52:08.315516 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:52:08.315375 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8-cert podName:ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8 nodeName:}" failed. No retries permitted until 2026-04-21 07:52:24.315354164 +0000 UTC m=+65.713601268 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8-cert") pod "ingress-canary-5krbh" (UID: "ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8") : secret "canary-serving-cert" not found Apr 21 07:52:08.315516 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:52:08.315392 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-registry-tls podName:408521c6-dc50-4c6a-b423-65779840ad61 nodeName:}" failed. No retries permitted until 2026-04-21 07:52:24.315381597 +0000 UTC m=+65.713628699 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-registry-tls") pod "image-registry-6466bc5fb5-c9dfh" (UID: "408521c6-dc50-4c6a-b423-65779840ad61") : secret "image-registry-tls" not found Apr 21 07:52:08.415829 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:52:08.415783 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/051ee8de-c3b9-4235-b367-e1804c2a570a-metrics-tls\") pod \"dns-default-gt2km\" (UID: \"051ee8de-c3b9-4235-b367-e1804c2a570a\") " pod="openshift-dns/dns-default-gt2km" Apr 21 07:52:08.416005 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:52:08.415926 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 07:52:08.416005 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:52:08.415991 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/051ee8de-c3b9-4235-b367-e1804c2a570a-metrics-tls podName:051ee8de-c3b9-4235-b367-e1804c2a570a nodeName:}" failed. No retries permitted until 2026-04-21 07:52:24.415976671 +0000 UTC m=+65.814223769 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/051ee8de-c3b9-4235-b367-e1804c2a570a-metrics-tls") pod "dns-default-gt2km" (UID: "051ee8de-c3b9-4235-b367-e1804c2a570a") : secret "dns-default-metrics-tls" not found Apr 21 07:52:18.372082 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:52:18.372054 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2phll" Apr 21 07:52:24.334001 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:52:24.333961 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8-cert\") pod \"ingress-canary-5krbh\" (UID: \"ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8\") " pod="openshift-ingress-canary/ingress-canary-5krbh" Apr 21 07:52:24.334372 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:52:24.334038 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-registry-tls\") pod \"image-registry-6466bc5fb5-c9dfh\" (UID: \"408521c6-dc50-4c6a-b423-65779840ad61\") " pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh" Apr 21 07:52:24.334372 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:52:24.334116 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 07:52:24.334372 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:52:24.334172 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8-cert podName:ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8 nodeName:}" failed. No retries permitted until 2026-04-21 07:52:56.334158547 +0000 UTC m=+97.732405648 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8-cert") pod "ingress-canary-5krbh" (UID: "ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8") : secret "canary-serving-cert" not found Apr 21 07:52:24.334372 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:52:24.334179 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 07:52:24.334372 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:52:24.334198 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6466bc5fb5-c9dfh: secret "image-registry-tls" not found Apr 21 07:52:24.334372 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:52:24.334253 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-registry-tls podName:408521c6-dc50-4c6a-b423-65779840ad61 nodeName:}" failed. No retries permitted until 2026-04-21 07:52:56.334240062 +0000 UTC m=+97.732487163 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-registry-tls") pod "image-registry-6466bc5fb5-c9dfh" (UID: "408521c6-dc50-4c6a-b423-65779840ad61") : secret "image-registry-tls" not found Apr 21 07:52:24.434418 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:52:24.434387 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/051ee8de-c3b9-4235-b367-e1804c2a570a-metrics-tls\") pod \"dns-default-gt2km\" (UID: \"051ee8de-c3b9-4235-b367-e1804c2a570a\") " pod="openshift-dns/dns-default-gt2km" Apr 21 07:52:24.434585 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:52:24.434504 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 07:52:24.434585 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:52:24.434557 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/051ee8de-c3b9-4235-b367-e1804c2a570a-metrics-tls podName:051ee8de-c3b9-4235-b367-e1804c2a570a nodeName:}" failed. No retries permitted until 2026-04-21 07:52:56.434541472 +0000 UTC m=+97.832788570 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/051ee8de-c3b9-4235-b367-e1804c2a570a-metrics-tls") pod "dns-default-gt2km" (UID: "051ee8de-c3b9-4235-b367-e1804c2a570a") : secret "dns-default-metrics-tls" not found Apr 21 07:52:24.939577 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:52:24.939545 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/292daedb-8f6d-4fbe-b50d-eff99dbdb227-metrics-certs\") pod \"network-metrics-daemon-mbkk9\" (UID: \"292daedb-8f6d-4fbe-b50d-eff99dbdb227\") " pod="openshift-multus/network-metrics-daemon-mbkk9" Apr 21 07:52:24.941973 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:52:24.941949 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 07:52:24.950030 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:52:24.950006 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 07:52:24.950110 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:52:24.950084 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/292daedb-8f6d-4fbe-b50d-eff99dbdb227-metrics-certs podName:292daedb-8f6d-4fbe-b50d-eff99dbdb227 nodeName:}" failed. No retries permitted until 2026-04-21 07:53:28.950060737 +0000 UTC m=+130.348307835 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/292daedb-8f6d-4fbe-b50d-eff99dbdb227-metrics-certs") pod "network-metrics-daemon-mbkk9" (UID: "292daedb-8f6d-4fbe-b50d-eff99dbdb227") : secret "metrics-daemon-secret" not found Apr 21 07:52:25.140952 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:52:25.140913 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ps2z\" (UniqueName: \"kubernetes.io/projected/b183c600-7bbc-4275-b1b6-1a71e7b6cc15-kube-api-access-5ps2z\") pod \"network-check-target-9d664\" (UID: \"b183c600-7bbc-4275-b1b6-1a71e7b6cc15\") " pod="openshift-network-diagnostics/network-check-target-9d664" Apr 21 07:52:25.143414 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:52:25.143392 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 07:52:25.153131 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:52:25.153105 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 07:52:25.165023 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:52:25.164996 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ps2z\" (UniqueName: \"kubernetes.io/projected/b183c600-7bbc-4275-b1b6-1a71e7b6cc15-kube-api-access-5ps2z\") pod \"network-check-target-9d664\" (UID: \"b183c600-7bbc-4275-b1b6-1a71e7b6cc15\") " pod="openshift-network-diagnostics/network-check-target-9d664" Apr 21 07:52:25.177879 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:52:25.177856 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-f27q5\"" Apr 21 07:52:25.186606 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:52:25.186590 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9d664" Apr 21 07:52:25.300079 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:52:25.300046 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9d664"] Apr 21 07:52:25.304692 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:52:25.304644 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb183c600_7bbc_4275_b1b6_1a71e7b6cc15.slice/crio-ec85bf60581d8dbfef0ef83eb49aee7a6341b95ab10b95262e0ae5d13813a853 WatchSource:0}: Error finding container ec85bf60581d8dbfef0ef83eb49aee7a6341b95ab10b95262e0ae5d13813a853: Status 404 returned error can't find the container with id ec85bf60581d8dbfef0ef83eb49aee7a6341b95ab10b95262e0ae5d13813a853 Apr 21 07:52:25.479004 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:52:25.478961 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9d664" event={"ID":"b183c600-7bbc-4275-b1b6-1a71e7b6cc15","Type":"ContainerStarted","Data":"ec85bf60581d8dbfef0ef83eb49aee7a6341b95ab10b95262e0ae5d13813a853"} Apr 21 07:52:29.490240 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:52:29.490196 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9d664" event={"ID":"b183c600-7bbc-4275-b1b6-1a71e7b6cc15","Type":"ContainerStarted","Data":"06bc050b9a3613b59919071e81502dbdfa1d222d7a46d956ed4b9cb27bc3a0a6"} Apr 21 07:52:29.490749 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:52:29.490414 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-9d664" Apr 21 07:52:29.504967 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:52:29.504922 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-9d664" 
podStartSLOduration=67.266091161 podStartE2EDuration="1m10.504909614s" podCreationTimestamp="2026-04-21 07:51:19 +0000 UTC" firstStartedPulling="2026-04-21 07:52:25.306459323 +0000 UTC m=+66.704706421" lastFinishedPulling="2026-04-21 07:52:28.545277773 +0000 UTC m=+69.943524874" observedRunningTime="2026-04-21 07:52:29.503416362 +0000 UTC m=+70.901663483" watchObservedRunningTime="2026-04-21 07:52:29.504909614 +0000 UTC m=+70.903156712" Apr 21 07:52:56.388000 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:52:56.387909 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8-cert\") pod \"ingress-canary-5krbh\" (UID: \"ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8\") " pod="openshift-ingress-canary/ingress-canary-5krbh" Apr 21 07:52:56.388000 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:52:56.387971 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-registry-tls\") pod \"image-registry-6466bc5fb5-c9dfh\" (UID: \"408521c6-dc50-4c6a-b423-65779840ad61\") " pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh" Apr 21 07:52:56.388438 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:52:56.388055 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 07:52:56.388438 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:52:56.388085 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 07:52:56.388438 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:52:56.388100 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6466bc5fb5-c9dfh: secret "image-registry-tls" not found Apr 21 07:52:56.388438 ip-10-0-130-176 kubenswrapper[2567]: E0421 
07:52:56.388131 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8-cert podName:ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8 nodeName:}" failed. No retries permitted until 2026-04-21 07:54:00.388110166 +0000 UTC m=+161.786357265 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8-cert") pod "ingress-canary-5krbh" (UID: "ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8") : secret "canary-serving-cert" not found Apr 21 07:52:56.388438 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:52:56.388179 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-registry-tls podName:408521c6-dc50-4c6a-b423-65779840ad61 nodeName:}" failed. No retries permitted until 2026-04-21 07:54:00.388162775 +0000 UTC m=+161.786409878 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-registry-tls") pod "image-registry-6466bc5fb5-c9dfh" (UID: "408521c6-dc50-4c6a-b423-65779840ad61") : secret "image-registry-tls" not found Apr 21 07:52:56.488933 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:52:56.488885 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/051ee8de-c3b9-4235-b367-e1804c2a570a-metrics-tls\") pod \"dns-default-gt2km\" (UID: \"051ee8de-c3b9-4235-b367-e1804c2a570a\") " pod="openshift-dns/dns-default-gt2km" Apr 21 07:52:56.489072 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:52:56.489032 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 07:52:56.489109 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:52:56.489094 2567 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/051ee8de-c3b9-4235-b367-e1804c2a570a-metrics-tls podName:051ee8de-c3b9-4235-b367-e1804c2a570a nodeName:}" failed. No retries permitted until 2026-04-21 07:54:00.489079711 +0000 UTC m=+161.887326809 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/051ee8de-c3b9-4235-b367-e1804c2a570a-metrics-tls") pod "dns-default-gt2km" (UID: "051ee8de-c3b9-4235-b367-e1804c2a570a") : secret "dns-default-metrics-tls" not found Apr 21 07:53:00.495303 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:00.495275 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-9d664" Apr 21 07:53:28.033523 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:28.033495 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-f8v2g_4d8fb59f-57b4-44fc-bfd7-f9714c35d8be/dns-node-resolver/0.log" Apr 21 07:53:29.026028 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:29.025994 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/292daedb-8f6d-4fbe-b50d-eff99dbdb227-metrics-certs\") pod \"network-metrics-daemon-mbkk9\" (UID: \"292daedb-8f6d-4fbe-b50d-eff99dbdb227\") " pod="openshift-multus/network-metrics-daemon-mbkk9" Apr 21 07:53:29.026197 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:53:29.026137 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 07:53:29.026240 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:53:29.026216 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/292daedb-8f6d-4fbe-b50d-eff99dbdb227-metrics-certs podName:292daedb-8f6d-4fbe-b50d-eff99dbdb227 nodeName:}" failed. No retries permitted until 2026-04-21 07:55:31.026198644 +0000 UTC m=+252.424445741 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/292daedb-8f6d-4fbe-b50d-eff99dbdb227-metrics-certs") pod "network-metrics-daemon-mbkk9" (UID: "292daedb-8f6d-4fbe-b50d-eff99dbdb227") : secret "metrics-daemon-secret" not found Apr 21 07:53:29.032988 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:29.032969 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-pn27z_fbed9d63-ec12-483e-ba8d-a4082bbfd141/node-ca/0.log" Apr 21 07:53:39.655928 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:39.655896 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-8v6xm"] Apr 21 07:53:39.658910 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:39.658888 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8v6xm" Apr 21 07:53:39.661272 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:39.661248 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 07:53:39.661391 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:39.661299 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-vjklh\"" Apr 21 07:53:39.662189 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:39.662171 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 07:53:39.662278 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:39.662220 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 07:53:39.662278 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:39.662236 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 07:53:39.669145 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:39.669122 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8v6xm"] Apr 21 07:53:39.711165 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:39.711125 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0c48b594-686f-497a-b780-493e11888b34-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8v6xm\" (UID: \"0c48b594-686f-497a-b780-493e11888b34\") " pod="openshift-insights/insights-runtime-extractor-8v6xm" Apr 21 07:53:39.711337 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:39.711188 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0c48b594-686f-497a-b780-493e11888b34-crio-socket\") pod \"insights-runtime-extractor-8v6xm\" (UID: \"0c48b594-686f-497a-b780-493e11888b34\") " pod="openshift-insights/insights-runtime-extractor-8v6xm" Apr 21 07:53:39.711337 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:39.711215 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0c48b594-686f-497a-b780-493e11888b34-data-volume\") pod \"insights-runtime-extractor-8v6xm\" (UID: \"0c48b594-686f-497a-b780-493e11888b34\") " pod="openshift-insights/insights-runtime-extractor-8v6xm" Apr 21 07:53:39.711337 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:39.711255 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0c48b594-686f-497a-b780-493e11888b34-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8v6xm\" (UID: \"0c48b594-686f-497a-b780-493e11888b34\") 
" pod="openshift-insights/insights-runtime-extractor-8v6xm" Apr 21 07:53:39.711488 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:39.711338 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sxh4\" (UniqueName: \"kubernetes.io/projected/0c48b594-686f-497a-b780-493e11888b34-kube-api-access-2sxh4\") pod \"insights-runtime-extractor-8v6xm\" (UID: \"0c48b594-686f-497a-b780-493e11888b34\") " pod="openshift-insights/insights-runtime-extractor-8v6xm" Apr 21 07:53:39.812559 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:39.812525 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0c48b594-686f-497a-b780-493e11888b34-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8v6xm\" (UID: \"0c48b594-686f-497a-b780-493e11888b34\") " pod="openshift-insights/insights-runtime-extractor-8v6xm" Apr 21 07:53:39.812559 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:39.812560 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2sxh4\" (UniqueName: \"kubernetes.io/projected/0c48b594-686f-497a-b780-493e11888b34-kube-api-access-2sxh4\") pod \"insights-runtime-extractor-8v6xm\" (UID: \"0c48b594-686f-497a-b780-493e11888b34\") " pod="openshift-insights/insights-runtime-extractor-8v6xm" Apr 21 07:53:39.812775 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:39.812610 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0c48b594-686f-497a-b780-493e11888b34-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8v6xm\" (UID: \"0c48b594-686f-497a-b780-493e11888b34\") " pod="openshift-insights/insights-runtime-extractor-8v6xm" Apr 21 07:53:39.812775 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:39.812643 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0c48b594-686f-497a-b780-493e11888b34-crio-socket\") pod \"insights-runtime-extractor-8v6xm\" (UID: \"0c48b594-686f-497a-b780-493e11888b34\") " pod="openshift-insights/insights-runtime-extractor-8v6xm" Apr 21 07:53:39.812775 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:53:39.812685 2567 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 21 07:53:39.812775 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:53:39.812753 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c48b594-686f-497a-b780-493e11888b34-insights-runtime-extractor-tls podName:0c48b594-686f-497a-b780-493e11888b34 nodeName:}" failed. No retries permitted until 2026-04-21 07:53:40.312737868 +0000 UTC m=+141.710984966 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/0c48b594-686f-497a-b780-493e11888b34-insights-runtime-extractor-tls") pod "insights-runtime-extractor-8v6xm" (UID: "0c48b594-686f-497a-b780-493e11888b34") : secret "insights-runtime-extractor-tls" not found Apr 21 07:53:39.812775 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:39.812688 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0c48b594-686f-497a-b780-493e11888b34-data-volume\") pod \"insights-runtime-extractor-8v6xm\" (UID: \"0c48b594-686f-497a-b780-493e11888b34\") " pod="openshift-insights/insights-runtime-extractor-8v6xm" Apr 21 07:53:39.812954 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:39.812757 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0c48b594-686f-497a-b780-493e11888b34-crio-socket\") pod \"insights-runtime-extractor-8v6xm\" (UID: \"0c48b594-686f-497a-b780-493e11888b34\") " 
pod="openshift-insights/insights-runtime-extractor-8v6xm" Apr 21 07:53:39.813009 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:39.812991 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0c48b594-686f-497a-b780-493e11888b34-data-volume\") pod \"insights-runtime-extractor-8v6xm\" (UID: \"0c48b594-686f-497a-b780-493e11888b34\") " pod="openshift-insights/insights-runtime-extractor-8v6xm" Apr 21 07:53:39.813183 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:39.813168 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0c48b594-686f-497a-b780-493e11888b34-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8v6xm\" (UID: \"0c48b594-686f-497a-b780-493e11888b34\") " pod="openshift-insights/insights-runtime-extractor-8v6xm" Apr 21 07:53:39.820895 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:39.820876 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sxh4\" (UniqueName: \"kubernetes.io/projected/0c48b594-686f-497a-b780-493e11888b34-kube-api-access-2sxh4\") pod \"insights-runtime-extractor-8v6xm\" (UID: \"0c48b594-686f-497a-b780-493e11888b34\") " pod="openshift-insights/insights-runtime-extractor-8v6xm" Apr 21 07:53:40.317983 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:40.317939 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0c48b594-686f-497a-b780-493e11888b34-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8v6xm\" (UID: \"0c48b594-686f-497a-b780-493e11888b34\") " pod="openshift-insights/insights-runtime-extractor-8v6xm" Apr 21 07:53:40.318170 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:53:40.318094 2567 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not 
found Apr 21 07:53:40.318214 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:53:40.318176 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c48b594-686f-497a-b780-493e11888b34-insights-runtime-extractor-tls podName:0c48b594-686f-497a-b780-493e11888b34 nodeName:}" failed. No retries permitted until 2026-04-21 07:53:41.318159986 +0000 UTC m=+142.716407085 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/0c48b594-686f-497a-b780-493e11888b34-insights-runtime-extractor-tls") pod "insights-runtime-extractor-8v6xm" (UID: "0c48b594-686f-497a-b780-493e11888b34") : secret "insights-runtime-extractor-tls" not found Apr 21 07:53:41.324791 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:41.324735 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0c48b594-686f-497a-b780-493e11888b34-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8v6xm\" (UID: \"0c48b594-686f-497a-b780-493e11888b34\") " pod="openshift-insights/insights-runtime-extractor-8v6xm" Apr 21 07:53:41.325185 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:53:41.324865 2567 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 21 07:53:41.325185 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:53:41.324922 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c48b594-686f-497a-b780-493e11888b34-insights-runtime-extractor-tls podName:0c48b594-686f-497a-b780-493e11888b34 nodeName:}" failed. No retries permitted until 2026-04-21 07:53:43.324904985 +0000 UTC m=+144.723152084 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/0c48b594-686f-497a-b780-493e11888b34-insights-runtime-extractor-tls") pod "insights-runtime-extractor-8v6xm" (UID: "0c48b594-686f-497a-b780-493e11888b34") : secret "insights-runtime-extractor-tls" not found Apr 21 07:53:43.340278 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:43.340241 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0c48b594-686f-497a-b780-493e11888b34-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8v6xm\" (UID: \"0c48b594-686f-497a-b780-493e11888b34\") " pod="openshift-insights/insights-runtime-extractor-8v6xm" Apr 21 07:53:43.340647 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:53:43.340387 2567 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 21 07:53:43.340647 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:53:43.340456 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c48b594-686f-497a-b780-493e11888b34-insights-runtime-extractor-tls podName:0c48b594-686f-497a-b780-493e11888b34 nodeName:}" failed. No retries permitted until 2026-04-21 07:53:47.340438025 +0000 UTC m=+148.738685129 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/0c48b594-686f-497a-b780-493e11888b34-insights-runtime-extractor-tls") pod "insights-runtime-extractor-8v6xm" (UID: "0c48b594-686f-497a-b780-493e11888b34") : secret "insights-runtime-extractor-tls" not found Apr 21 07:53:47.375017 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:47.374977 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0c48b594-686f-497a-b780-493e11888b34-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8v6xm\" (UID: \"0c48b594-686f-497a-b780-493e11888b34\") " pod="openshift-insights/insights-runtime-extractor-8v6xm" Apr 21 07:53:47.375494 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:53:47.375126 2567 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 21 07:53:47.375494 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:53:47.375189 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c48b594-686f-497a-b780-493e11888b34-insights-runtime-extractor-tls podName:0c48b594-686f-497a-b780-493e11888b34 nodeName:}" failed. No retries permitted until 2026-04-21 07:53:55.375173653 +0000 UTC m=+156.773420755 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/0c48b594-686f-497a-b780-493e11888b34-insights-runtime-extractor-tls") pod "insights-runtime-extractor-8v6xm" (UID: "0c48b594-686f-497a-b780-493e11888b34") : secret "insights-runtime-extractor-tls" not found Apr 21 07:53:55.441465 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:55.441420 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0c48b594-686f-497a-b780-493e11888b34-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8v6xm\" (UID: \"0c48b594-686f-497a-b780-493e11888b34\") " pod="openshift-insights/insights-runtime-extractor-8v6xm" Apr 21 07:53:55.443749 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:55.443724 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0c48b594-686f-497a-b780-493e11888b34-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8v6xm\" (UID: \"0c48b594-686f-497a-b780-493e11888b34\") " pod="openshift-insights/insights-runtime-extractor-8v6xm" Apr 21 07:53:55.451755 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:53:55.451723 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh" podUID="408521c6-dc50-4c6a-b423-65779840ad61" Apr 21 07:53:55.511058 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:53:55.511011 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-5krbh" podUID="ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8" Apr 21 07:53:55.529205 ip-10-0-130-176 kubenswrapper[2567]: E0421 
07:53:55.529171 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-gt2km" podUID="051ee8de-c3b9-4235-b367-e1804c2a570a" Apr 21 07:53:55.567539 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:55.567495 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8v6xm" Apr 21 07:53:55.684027 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:55.683958 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh" Apr 21 07:53:55.685468 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:55.685448 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8v6xm"] Apr 21 07:53:55.688028 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:53:55.688004 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c48b594_686f_497a_b780_493e11888b34.slice/crio-a7442998f9e3db8d576bbac732046f4999a71df0f556fcc61259c03f1fb7c901 WatchSource:0}: Error finding container a7442998f9e3db8d576bbac732046f4999a71df0f556fcc61259c03f1fb7c901: Status 404 returned error can't find the container with id a7442998f9e3db8d576bbac732046f4999a71df0f556fcc61259c03f1fb7c901 Apr 21 07:53:56.687303 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:56.687225 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8v6xm" event={"ID":"0c48b594-686f-497a-b780-493e11888b34","Type":"ContainerStarted","Data":"1f6d85e30ed55f035ba4eef1abbfb2022c55857558e874ec2c801831cbfc4af2"} Apr 21 07:53:56.687303 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:56.687257 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-8v6xm" event={"ID":"0c48b594-686f-497a-b780-493e11888b34","Type":"ContainerStarted","Data":"2e9589ebf9a12bffa58f87e2c355732bdb763b96cb3ecbfc7c8e9279525cee79"} Apr 21 07:53:56.687303 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:56.687274 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8v6xm" event={"ID":"0c48b594-686f-497a-b780-493e11888b34","Type":"ContainerStarted","Data":"a7442998f9e3db8d576bbac732046f4999a71df0f556fcc61259c03f1fb7c901"} Apr 21 07:53:57.267766 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:53:57.267722 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-mbkk9" podUID="292daedb-8f6d-4fbe-b50d-eff99dbdb227" Apr 21 07:53:58.693026 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:58.692994 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8v6xm" event={"ID":"0c48b594-686f-497a-b780-493e11888b34","Type":"ContainerStarted","Data":"d0a68c6b8129d758d1b4e3707d3e924ba345ac6a305adce7ca1d011b62e7236d"} Apr 21 07:53:58.709580 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:58.709538 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-8v6xm" podStartSLOduration=17.261625102 podStartE2EDuration="19.709523077s" podCreationTimestamp="2026-04-21 07:53:39 +0000 UTC" firstStartedPulling="2026-04-21 07:53:55.737913709 +0000 UTC m=+157.136160808" lastFinishedPulling="2026-04-21 07:53:58.185811683 +0000 UTC m=+159.584058783" observedRunningTime="2026-04-21 07:53:58.708707392 +0000 UTC m=+160.106954509" watchObservedRunningTime="2026-04-21 07:53:58.709523077 +0000 UTC m=+160.107770190" Apr 21 07:53:59.696448 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:59.696361 
2567 generic.go:358] "Generic (PLEG): container finished" podID="b900c2ec-54d0-46cc-802a-261f6c72720b" containerID="b90e040323e14033dfa0c8dd4a6f4183fd603a1c21b4f4742a6ab8104d39fbb8" exitCode=1 Apr 21 07:53:59.696448 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:59.696437 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-798bf55c97-2rlct" event={"ID":"b900c2ec-54d0-46cc-802a-261f6c72720b","Type":"ContainerDied","Data":"b90e040323e14033dfa0c8dd4a6f4183fd603a1c21b4f4742a6ab8104d39fbb8"} Apr 21 07:53:59.696983 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:59.696860 2567 scope.go:117] "RemoveContainer" containerID="b90e040323e14033dfa0c8dd4a6f4183fd603a1c21b4f4742a6ab8104d39fbb8" Apr 21 07:53:59.697756 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:59.697731 2567 generic.go:358] "Generic (PLEG): container finished" podID="257661c1-c0fc-4a0f-a1b7-4bef97abe473" containerID="5fe19c57521cc93a042266ee8a69f356b14aee2f6ca0ceea4d1035207c56ceb7" exitCode=255 Apr 21 07:53:59.697882 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:59.697799 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56dfc4bb8c-pxf5v" event={"ID":"257661c1-c0fc-4a0f-a1b7-4bef97abe473","Type":"ContainerDied","Data":"5fe19c57521cc93a042266ee8a69f356b14aee2f6ca0ceea4d1035207c56ceb7"} Apr 21 07:53:59.698209 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:53:59.698195 2567 scope.go:117] "RemoveContainer" containerID="5fe19c57521cc93a042266ee8a69f356b14aee2f6ca0ceea4d1035207c56ceb7" Apr 21 07:54:00.486061 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:00.486004 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-registry-tls\") pod \"image-registry-6466bc5fb5-c9dfh\" (UID: \"408521c6-dc50-4c6a-b423-65779840ad61\") " 
pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh" Apr 21 07:54:00.486259 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:00.486077 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8-cert\") pod \"ingress-canary-5krbh\" (UID: \"ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8\") " pod="openshift-ingress-canary/ingress-canary-5krbh" Apr 21 07:54:00.488475 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:00.488443 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8-cert\") pod \"ingress-canary-5krbh\" (UID: \"ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8\") " pod="openshift-ingress-canary/ingress-canary-5krbh" Apr 21 07:54:00.488475 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:00.488456 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-registry-tls\") pod \"image-registry-6466bc5fb5-c9dfh\" (UID: \"408521c6-dc50-4c6a-b423-65779840ad61\") " pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh" Apr 21 07:54:00.586722 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:00.586689 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/051ee8de-c3b9-4235-b367-e1804c2a570a-metrics-tls\") pod \"dns-default-gt2km\" (UID: \"051ee8de-c3b9-4235-b367-e1804c2a570a\") " pod="openshift-dns/dns-default-gt2km" Apr 21 07:54:00.588859 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:00.588838 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/051ee8de-c3b9-4235-b367-e1804c2a570a-metrics-tls\") pod \"dns-default-gt2km\" (UID: \"051ee8de-c3b9-4235-b367-e1804c2a570a\") " pod="openshift-dns/dns-default-gt2km" Apr 
21 07:54:00.701176 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:00.701143 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-798bf55c97-2rlct" event={"ID":"b900c2ec-54d0-46cc-802a-261f6c72720b","Type":"ContainerStarted","Data":"d37f5073145e5d5de4a85f56ef2c240a213b3a38314bec44e0d73a69677cf04f"} Apr 21 07:54:00.701600 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:00.701474 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-798bf55c97-2rlct" Apr 21 07:54:00.702196 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:00.702172 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-798bf55c97-2rlct" Apr 21 07:54:00.702669 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:00.702632 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56dfc4bb8c-pxf5v" event={"ID":"257661c1-c0fc-4a0f-a1b7-4bef97abe473","Type":"ContainerStarted","Data":"924eaa9455a83424140e5db4c4907fb9d619dda27aa318d9602f933bfd7afde3"} Apr 21 07:54:00.787197 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:00.787107 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-cp6pq\"" Apr 21 07:54:00.794965 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:00.794941 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh" Apr 21 07:54:00.914918 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:00.914886 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6466bc5fb5-c9dfh"] Apr 21 07:54:00.918049 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:54:00.918014 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod408521c6_dc50_4c6a_b423_65779840ad61.slice/crio-c7d55cbc201bb0cf6a74049aa34d210403ff3a0e99e9bd65124aaf70e03ee110 WatchSource:0}: Error finding container c7d55cbc201bb0cf6a74049aa34d210403ff3a0e99e9bd65124aaf70e03ee110: Status 404 returned error can't find the container with id c7d55cbc201bb0cf6a74049aa34d210403ff3a0e99e9bd65124aaf70e03ee110 Apr 21 07:54:01.709079 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:01.709034 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh" event={"ID":"408521c6-dc50-4c6a-b423-65779840ad61","Type":"ContainerStarted","Data":"f12664297ed92c34e8c162adc3a25f9da5d9b24211422e6218329aa18a3405f2"} Apr 21 07:54:01.709079 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:01.709082 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh" event={"ID":"408521c6-dc50-4c6a-b423-65779840ad61","Type":"ContainerStarted","Data":"c7d55cbc201bb0cf6a74049aa34d210403ff3a0e99e9bd65124aaf70e03ee110"} Apr 21 07:54:01.709514 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:01.709239 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh" Apr 21 07:54:01.729294 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:01.728640 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh" 
podStartSLOduration=162.728623019 podStartE2EDuration="2m42.728623019s" podCreationTimestamp="2026-04-21 07:51:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:54:01.728530092 +0000 UTC m=+163.126777213" watchObservedRunningTime="2026-04-21 07:54:01.728623019 +0000 UTC m=+163.126870140" Apr 21 07:54:05.661827 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:05.661796 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-4sjs4"] Apr 21 07:54:05.666693 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:05.666671 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-4sjs4" Apr 21 07:54:05.670522 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:05.669717 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 21 07:54:05.670522 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:05.669795 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-wdcf8\"" Apr 21 07:54:05.673558 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:05.671852 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 21 07:54:05.673558 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:05.672143 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 21 07:54:05.673558 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:05.672402 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 21 07:54:05.673558 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:05.672632 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 21 07:54:05.673848 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:05.673631 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 21 07:54:05.822975 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:05.822938 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1be3fb2a-2904-4061-85f3-4a707df568f5-node-exporter-tls\") pod \"node-exporter-4sjs4\" (UID: \"1be3fb2a-2904-4061-85f3-4a707df568f5\") " pod="openshift-monitoring/node-exporter-4sjs4" Apr 21 07:54:05.823165 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:05.822981 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1be3fb2a-2904-4061-85f3-4a707df568f5-node-exporter-textfile\") pod \"node-exporter-4sjs4\" (UID: \"1be3fb2a-2904-4061-85f3-4a707df568f5\") " pod="openshift-monitoring/node-exporter-4sjs4" Apr 21 07:54:05.823165 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:05.823004 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1be3fb2a-2904-4061-85f3-4a707df568f5-sys\") pod \"node-exporter-4sjs4\" (UID: \"1be3fb2a-2904-4061-85f3-4a707df568f5\") " pod="openshift-monitoring/node-exporter-4sjs4" Apr 21 07:54:05.823165 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:05.823094 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1be3fb2a-2904-4061-85f3-4a707df568f5-root\") pod \"node-exporter-4sjs4\" (UID: \"1be3fb2a-2904-4061-85f3-4a707df568f5\") " pod="openshift-monitoring/node-exporter-4sjs4" Apr 21 07:54:05.823165 
ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:05.823129 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p82d\" (UniqueName: \"kubernetes.io/projected/1be3fb2a-2904-4061-85f3-4a707df568f5-kube-api-access-7p82d\") pod \"node-exporter-4sjs4\" (UID: \"1be3fb2a-2904-4061-85f3-4a707df568f5\") " pod="openshift-monitoring/node-exporter-4sjs4" Apr 21 07:54:05.823165 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:05.823151 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1be3fb2a-2904-4061-85f3-4a707df568f5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4sjs4\" (UID: \"1be3fb2a-2904-4061-85f3-4a707df568f5\") " pod="openshift-monitoring/node-exporter-4sjs4" Apr 21 07:54:05.823454 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:05.823215 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1be3fb2a-2904-4061-85f3-4a707df568f5-metrics-client-ca\") pod \"node-exporter-4sjs4\" (UID: \"1be3fb2a-2904-4061-85f3-4a707df568f5\") " pod="openshift-monitoring/node-exporter-4sjs4" Apr 21 07:54:05.823454 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:05.823283 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1be3fb2a-2904-4061-85f3-4a707df568f5-node-exporter-accelerators-collector-config\") pod \"node-exporter-4sjs4\" (UID: \"1be3fb2a-2904-4061-85f3-4a707df568f5\") " pod="openshift-monitoring/node-exporter-4sjs4" Apr 21 07:54:05.823454 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:05.823320 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/1be3fb2a-2904-4061-85f3-4a707df568f5-node-exporter-wtmp\") pod \"node-exporter-4sjs4\" (UID: \"1be3fb2a-2904-4061-85f3-4a707df568f5\") " pod="openshift-monitoring/node-exporter-4sjs4" Apr 21 07:54:05.923988 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:05.923891 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1be3fb2a-2904-4061-85f3-4a707df568f5-node-exporter-accelerators-collector-config\") pod \"node-exporter-4sjs4\" (UID: \"1be3fb2a-2904-4061-85f3-4a707df568f5\") " pod="openshift-monitoring/node-exporter-4sjs4" Apr 21 07:54:05.923988 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:05.923939 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1be3fb2a-2904-4061-85f3-4a707df568f5-node-exporter-wtmp\") pod \"node-exporter-4sjs4\" (UID: \"1be3fb2a-2904-4061-85f3-4a707df568f5\") " pod="openshift-monitoring/node-exporter-4sjs4" Apr 21 07:54:05.923988 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:05.923970 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1be3fb2a-2904-4061-85f3-4a707df568f5-node-exporter-tls\") pod \"node-exporter-4sjs4\" (UID: \"1be3fb2a-2904-4061-85f3-4a707df568f5\") " pod="openshift-monitoring/node-exporter-4sjs4" Apr 21 07:54:05.924246 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:05.924018 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1be3fb2a-2904-4061-85f3-4a707df568f5-node-exporter-textfile\") pod \"node-exporter-4sjs4\" (UID: \"1be3fb2a-2904-4061-85f3-4a707df568f5\") " pod="openshift-monitoring/node-exporter-4sjs4" Apr 21 07:54:05.924246 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:05.924043 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1be3fb2a-2904-4061-85f3-4a707df568f5-sys\") pod \"node-exporter-4sjs4\" (UID: \"1be3fb2a-2904-4061-85f3-4a707df568f5\") " pod="openshift-monitoring/node-exporter-4sjs4" Apr 21 07:54:05.924246 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:05.924075 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1be3fb2a-2904-4061-85f3-4a707df568f5-root\") pod \"node-exporter-4sjs4\" (UID: \"1be3fb2a-2904-4061-85f3-4a707df568f5\") " pod="openshift-monitoring/node-exporter-4sjs4" Apr 21 07:54:05.924246 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:05.924105 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1be3fb2a-2904-4061-85f3-4a707df568f5-node-exporter-wtmp\") pod \"node-exporter-4sjs4\" (UID: \"1be3fb2a-2904-4061-85f3-4a707df568f5\") " pod="openshift-monitoring/node-exporter-4sjs4" Apr 21 07:54:05.924246 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:05.924146 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1be3fb2a-2904-4061-85f3-4a707df568f5-sys\") pod \"node-exporter-4sjs4\" (UID: \"1be3fb2a-2904-4061-85f3-4a707df568f5\") " pod="openshift-monitoring/node-exporter-4sjs4" Apr 21 07:54:05.924246 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:05.924157 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1be3fb2a-2904-4061-85f3-4a707df568f5-root\") pod \"node-exporter-4sjs4\" (UID: \"1be3fb2a-2904-4061-85f3-4a707df568f5\") " pod="openshift-monitoring/node-exporter-4sjs4" Apr 21 07:54:05.924246 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:05.924113 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-7p82d\" (UniqueName: \"kubernetes.io/projected/1be3fb2a-2904-4061-85f3-4a707df568f5-kube-api-access-7p82d\") pod \"node-exporter-4sjs4\" (UID: \"1be3fb2a-2904-4061-85f3-4a707df568f5\") " pod="openshift-monitoring/node-exporter-4sjs4" Apr 21 07:54:05.924246 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:05.924227 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1be3fb2a-2904-4061-85f3-4a707df568f5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4sjs4\" (UID: \"1be3fb2a-2904-4061-85f3-4a707df568f5\") " pod="openshift-monitoring/node-exporter-4sjs4" Apr 21 07:54:05.924601 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:05.924278 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1be3fb2a-2904-4061-85f3-4a707df568f5-metrics-client-ca\") pod \"node-exporter-4sjs4\" (UID: \"1be3fb2a-2904-4061-85f3-4a707df568f5\") " pod="openshift-monitoring/node-exporter-4sjs4" Apr 21 07:54:05.924601 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:05.924340 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1be3fb2a-2904-4061-85f3-4a707df568f5-node-exporter-textfile\") pod \"node-exporter-4sjs4\" (UID: \"1be3fb2a-2904-4061-85f3-4a707df568f5\") " pod="openshift-monitoring/node-exporter-4sjs4" Apr 21 07:54:05.924601 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:05.924513 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1be3fb2a-2904-4061-85f3-4a707df568f5-node-exporter-accelerators-collector-config\") pod \"node-exporter-4sjs4\" (UID: \"1be3fb2a-2904-4061-85f3-4a707df568f5\") " pod="openshift-monitoring/node-exporter-4sjs4" Apr 21 07:54:05.924790 ip-10-0-130-176 
kubenswrapper[2567]: I0421 07:54:05.924774 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1be3fb2a-2904-4061-85f3-4a707df568f5-metrics-client-ca\") pod \"node-exporter-4sjs4\" (UID: \"1be3fb2a-2904-4061-85f3-4a707df568f5\") " pod="openshift-monitoring/node-exporter-4sjs4" Apr 21 07:54:05.926439 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:05.926415 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1be3fb2a-2904-4061-85f3-4a707df568f5-node-exporter-tls\") pod \"node-exporter-4sjs4\" (UID: \"1be3fb2a-2904-4061-85f3-4a707df568f5\") " pod="openshift-monitoring/node-exporter-4sjs4" Apr 21 07:54:05.926530 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:05.926453 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1be3fb2a-2904-4061-85f3-4a707df568f5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4sjs4\" (UID: \"1be3fb2a-2904-4061-85f3-4a707df568f5\") " pod="openshift-monitoring/node-exporter-4sjs4" Apr 21 07:54:05.931486 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:05.931467 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p82d\" (UniqueName: \"kubernetes.io/projected/1be3fb2a-2904-4061-85f3-4a707df568f5-kube-api-access-7p82d\") pod \"node-exporter-4sjs4\" (UID: \"1be3fb2a-2904-4061-85f3-4a707df568f5\") " pod="openshift-monitoring/node-exporter-4sjs4" Apr 21 07:54:05.979043 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:05.979014 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-4sjs4" Apr 21 07:54:05.986812 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:54:05.986780 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1be3fb2a_2904_4061_85f3_4a707df568f5.slice/crio-b0b598d15debe1c1385a773caf31e1bea756849f83813413c44edf9717cfee73 WatchSource:0}: Error finding container b0b598d15debe1c1385a773caf31e1bea756849f83813413c44edf9717cfee73: Status 404 returned error can't find the container with id b0b598d15debe1c1385a773caf31e1bea756849f83813413c44edf9717cfee73 Apr 21 07:54:06.723569 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:06.723527 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4sjs4" event={"ID":"1be3fb2a-2904-4061-85f3-4a707df568f5","Type":"ContainerStarted","Data":"b0b598d15debe1c1385a773caf31e1bea756849f83813413c44edf9717cfee73"} Apr 21 07:54:07.726927 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:07.726892 2567 generic.go:358] "Generic (PLEG): container finished" podID="1be3fb2a-2904-4061-85f3-4a707df568f5" containerID="88f484b8da5b68f1047578e40ef254f15425407b6d0cd645331edb0abf2d6ff5" exitCode=0 Apr 21 07:54:07.727270 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:07.726971 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4sjs4" event={"ID":"1be3fb2a-2904-4061-85f3-4a707df568f5","Type":"ContainerDied","Data":"88f484b8da5b68f1047578e40ef254f15425407b6d0cd645331edb0abf2d6ff5"} Apr 21 07:54:08.236334 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:08.236301 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5krbh" Apr 21 07:54:08.236483 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:08.236416 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-gt2km" Apr 21 07:54:08.238862 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:08.238840 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-4xjgf\"" Apr 21 07:54:08.238963 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:08.238932 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-pcpsl\"" Apr 21 07:54:08.246964 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:08.246939 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5krbh" Apr 21 07:54:08.246964 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:08.246960 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-gt2km" Apr 21 07:54:08.369037 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:08.368988 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5krbh"] Apr 21 07:54:08.371092 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:54:08.371065 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccaea438_6e90_4c2d_ae4a_b0e2d90f1eb8.slice/crio-04e1226970826e1f9dc884005aa8c3308efeb8090fd5698b6fe02a51a9bc0086 WatchSource:0}: Error finding container 04e1226970826e1f9dc884005aa8c3308efeb8090fd5698b6fe02a51a9bc0086: Status 404 returned error can't find the container with id 04e1226970826e1f9dc884005aa8c3308efeb8090fd5698b6fe02a51a9bc0086 Apr 21 07:54:08.388577 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:08.388531 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gt2km"] Apr 21 07:54:08.392804 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:54:08.392779 2567 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod051ee8de_c3b9_4235_b367_e1804c2a570a.slice/crio-95c5457b55bc4e29a74cd442a46d8f10ea1ea541d6b15d9463fec590488f5bd1 WatchSource:0}: Error finding container 95c5457b55bc4e29a74cd442a46d8f10ea1ea541d6b15d9463fec590488f5bd1: Status 404 returned error can't find the container with id 95c5457b55bc4e29a74cd442a46d8f10ea1ea541d6b15d9463fec590488f5bd1 Apr 21 07:54:08.730883 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:08.730854 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gt2km" event={"ID":"051ee8de-c3b9-4235-b367-e1804c2a570a","Type":"ContainerStarted","Data":"95c5457b55bc4e29a74cd442a46d8f10ea1ea541d6b15d9463fec590488f5bd1"} Apr 21 07:54:08.732694 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:08.732662 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4sjs4" event={"ID":"1be3fb2a-2904-4061-85f3-4a707df568f5","Type":"ContainerStarted","Data":"9709d7007c1da05b0181d43b2270aff763da4220571c01b0364c764036306909"} Apr 21 07:54:08.732819 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:08.732698 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4sjs4" event={"ID":"1be3fb2a-2904-4061-85f3-4a707df568f5","Type":"ContainerStarted","Data":"ba6f272e4ceb8267d2ef919004266680ae8433d481d12833922fa159ef1dfc46"} Apr 21 07:54:08.733606 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:08.733586 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5krbh" event={"ID":"ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8","Type":"ContainerStarted","Data":"04e1226970826e1f9dc884005aa8c3308efeb8090fd5698b6fe02a51a9bc0086"} Apr 21 07:54:08.750737 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:08.750701 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-4sjs4" podStartSLOduration=2.937105129 
podStartE2EDuration="3.750690097s" podCreationTimestamp="2026-04-21 07:54:05 +0000 UTC" firstStartedPulling="2026-04-21 07:54:05.988515312 +0000 UTC m=+167.386762410" lastFinishedPulling="2026-04-21 07:54:06.802100268 +0000 UTC m=+168.200347378" observedRunningTime="2026-04-21 07:54:08.749887214 +0000 UTC m=+170.148134337" watchObservedRunningTime="2026-04-21 07:54:08.750690097 +0000 UTC m=+170.148937217" Apr 21 07:54:11.743209 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:11.743168 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5krbh" event={"ID":"ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8","Type":"ContainerStarted","Data":"ebc7b46302061a3dd65ed12cfc6eb84509af0da85fc36443cce8dfdc9c5997e7"} Apr 21 07:54:11.744637 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:11.744611 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gt2km" event={"ID":"051ee8de-c3b9-4235-b367-e1804c2a570a","Type":"ContainerStarted","Data":"7dab4e7c2dde6e5d3806c8676240a6ee2378dbbd634bf538ab054606022ec76d"} Apr 21 07:54:11.744756 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:11.744640 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gt2km" event={"ID":"051ee8de-c3b9-4235-b367-e1804c2a570a","Type":"ContainerStarted","Data":"dc4696692ab9d7f9dbf18e1be77573ff2f16545a3533525484d770c81d99dfac"} Apr 21 07:54:11.744756 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:11.744707 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-gt2km" Apr 21 07:54:11.757518 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:11.757476 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-5krbh" podStartSLOduration=137.301012151 podStartE2EDuration="2m19.757462827s" podCreationTimestamp="2026-04-21 07:51:52 +0000 UTC" firstStartedPulling="2026-04-21 07:54:08.37322161 +0000 UTC 
m=+169.771468711" lastFinishedPulling="2026-04-21 07:54:10.82967228 +0000 UTC m=+172.227919387" observedRunningTime="2026-04-21 07:54:11.757069821 +0000 UTC m=+173.155316944" watchObservedRunningTime="2026-04-21 07:54:11.757462827 +0000 UTC m=+173.155709946" Apr 21 07:54:11.779111 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:11.779059 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-gt2km" podStartSLOduration=137.347115502 podStartE2EDuration="2m19.779047208s" podCreationTimestamp="2026-04-21 07:51:52 +0000 UTC" firstStartedPulling="2026-04-21 07:54:08.39459322 +0000 UTC m=+169.792840317" lastFinishedPulling="2026-04-21 07:54:10.826524922 +0000 UTC m=+172.224772023" observedRunningTime="2026-04-21 07:54:11.777793532 +0000 UTC m=+173.176040660" watchObservedRunningTime="2026-04-21 07:54:11.779047208 +0000 UTC m=+173.177294327" Apr 21 07:54:12.235648 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:12.235609 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mbkk9" Apr 21 07:54:20.798899 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:20.798865 2567 patch_prober.go:28] interesting pod/image-registry-6466bc5fb5-c9dfh container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 21 07:54:20.799358 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:20.798925 2567 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh" podUID="408521c6-dc50-4c6a-b423-65779840ad61" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 07:54:21.749744 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:21.749712 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-gt2km" Apr 21 07:54:22.715567 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:22.715532 2567 patch_prober.go:28] interesting pod/image-registry-6466bc5fb5-c9dfh container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 21 07:54:22.715937 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:22.715593 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh" podUID="408521c6-dc50-4c6a-b423-65779840ad61" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 07:54:30.218462 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:30.218427 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-image-registry/image-registry-6466bc5fb5-c9dfh"] Apr 21 07:54:30.223718 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:30.223690 2567 patch_prober.go:28] interesting pod/image-registry-6466bc5fb5-c9dfh container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 21 07:54:30.223858 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:30.223741 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh" podUID="408521c6-dc50-4c6a-b423-65779840ad61" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 07:54:40.222627 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:40.222595 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh" Apr 21 07:54:42.784716 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:42.784673 2567 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-947548f64-g4qpp" podUID="2b660957-c93c-4199-8ce0-edaea697da94" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 21 07:54:46.706774 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:46.706749 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4sjs4_1be3fb2a-2904-4061-85f3-4a707df568f5/init-textfile/0.log" Apr 21 07:54:46.907377 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:46.907350 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4sjs4_1be3fb2a-2904-4061-85f3-4a707df568f5/node-exporter/0.log" Apr 21 07:54:47.106435 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:47.106410 
2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4sjs4_1be3fb2a-2904-4061-85f3-4a707df568f5/kube-rbac-proxy/0.log" Apr 21 07:54:52.785225 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:52.785186 2567 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-947548f64-g4qpp" podUID="2b660957-c93c-4199-8ce0-edaea697da94" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 21 07:54:54.508694 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:54.508647 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-gt2km_051ee8de-c3b9-4235-b367-e1804c2a570a/dns/0.log" Apr 21 07:54:54.707032 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:54.706994 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-gt2km_051ee8de-c3b9-4235-b367-e1804c2a570a/kube-rbac-proxy/0.log" Apr 21 07:54:55.237493 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:55.237425 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh" podUID="408521c6-dc50-4c6a-b423-65779840ad61" containerName="registry" containerID="cri-o://f12664297ed92c34e8c162adc3a25f9da5d9b24211422e6218329aa18a3405f2" gracePeriod=30 Apr 21 07:54:55.306268 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:55.306242 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-f8v2g_4d8fb59f-57b4-44fc-bfd7-f9714c35d8be/dns-node-resolver/0.log" Apr 21 07:54:56.106872 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:56.106842 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-5krbh_ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8/serve-healthcheck-canary/0.log" Apr 21 07:54:56.479391 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:56.479371 2567 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh" Apr 21 07:54:56.593495 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:56.593405 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-bound-sa-token\") pod \"408521c6-dc50-4c6a-b423-65779840ad61\" (UID: \"408521c6-dc50-4c6a-b423-65779840ad61\") " Apr 21 07:54:56.593495 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:56.593474 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/408521c6-dc50-4c6a-b423-65779840ad61-trusted-ca\") pod \"408521c6-dc50-4c6a-b423-65779840ad61\" (UID: \"408521c6-dc50-4c6a-b423-65779840ad61\") " Apr 21 07:54:56.593751 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:56.593611 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/408521c6-dc50-4c6a-b423-65779840ad61-image-registry-private-configuration\") pod \"408521c6-dc50-4c6a-b423-65779840ad61\" (UID: \"408521c6-dc50-4c6a-b423-65779840ad61\") " Apr 21 07:54:56.593751 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:56.593646 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-registry-tls\") pod \"408521c6-dc50-4c6a-b423-65779840ad61\" (UID: \"408521c6-dc50-4c6a-b423-65779840ad61\") " Apr 21 07:54:56.593751 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:56.593701 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/408521c6-dc50-4c6a-b423-65779840ad61-ca-trust-extracted\") pod \"408521c6-dc50-4c6a-b423-65779840ad61\" (UID: 
\"408521c6-dc50-4c6a-b423-65779840ad61\") " Apr 21 07:54:56.593751 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:56.593725 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/408521c6-dc50-4c6a-b423-65779840ad61-registry-certificates\") pod \"408521c6-dc50-4c6a-b423-65779840ad61\" (UID: \"408521c6-dc50-4c6a-b423-65779840ad61\") " Apr 21 07:54:56.593751 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:56.593745 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9tp5\" (UniqueName: \"kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-kube-api-access-v9tp5\") pod \"408521c6-dc50-4c6a-b423-65779840ad61\" (UID: \"408521c6-dc50-4c6a-b423-65779840ad61\") " Apr 21 07:54:56.594005 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:56.593787 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/408521c6-dc50-4c6a-b423-65779840ad61-installation-pull-secrets\") pod \"408521c6-dc50-4c6a-b423-65779840ad61\" (UID: \"408521c6-dc50-4c6a-b423-65779840ad61\") " Apr 21 07:54:56.594060 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:56.593945 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/408521c6-dc50-4c6a-b423-65779840ad61-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "408521c6-dc50-4c6a-b423-65779840ad61" (UID: "408521c6-dc50-4c6a-b423-65779840ad61"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 07:54:56.594276 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:56.594242 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/408521c6-dc50-4c6a-b423-65779840ad61-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "408521c6-dc50-4c6a-b423-65779840ad61" (UID: "408521c6-dc50-4c6a-b423-65779840ad61"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 07:54:56.596633 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:56.596590 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "408521c6-dc50-4c6a-b423-65779840ad61" (UID: "408521c6-dc50-4c6a-b423-65779840ad61"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 07:54:56.596633 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:56.596597 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "408521c6-dc50-4c6a-b423-65779840ad61" (UID: "408521c6-dc50-4c6a-b423-65779840ad61"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 07:54:56.596633 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:56.596603 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/408521c6-dc50-4c6a-b423-65779840ad61-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "408521c6-dc50-4c6a-b423-65779840ad61" (UID: "408521c6-dc50-4c6a-b423-65779840ad61"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 07:54:56.596810 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:56.596714 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-kube-api-access-v9tp5" (OuterVolumeSpecName: "kube-api-access-v9tp5") pod "408521c6-dc50-4c6a-b423-65779840ad61" (UID: "408521c6-dc50-4c6a-b423-65779840ad61"). InnerVolumeSpecName "kube-api-access-v9tp5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 07:54:56.596810 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:56.596736 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/408521c6-dc50-4c6a-b423-65779840ad61-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "408521c6-dc50-4c6a-b423-65779840ad61" (UID: "408521c6-dc50-4c6a-b423-65779840ad61"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 07:54:56.604405 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:56.604370 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/408521c6-dc50-4c6a-b423-65779840ad61-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "408521c6-dc50-4c6a-b423-65779840ad61" (UID: "408521c6-dc50-4c6a-b423-65779840ad61"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 07:54:56.694992 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:56.694957 2567 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-bound-sa-token\") on node \"ip-10-0-130-176.ec2.internal\" DevicePath \"\"" Apr 21 07:54:56.694992 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:56.694987 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/408521c6-dc50-4c6a-b423-65779840ad61-trusted-ca\") on node \"ip-10-0-130-176.ec2.internal\" DevicePath \"\"" Apr 21 07:54:56.694992 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:56.694998 2567 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/408521c6-dc50-4c6a-b423-65779840ad61-image-registry-private-configuration\") on node \"ip-10-0-130-176.ec2.internal\" DevicePath \"\"" Apr 21 07:54:56.695294 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:56.695008 2567 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-registry-tls\") on node \"ip-10-0-130-176.ec2.internal\" DevicePath \"\"" Apr 21 07:54:56.695294 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:56.695017 2567 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/408521c6-dc50-4c6a-b423-65779840ad61-ca-trust-extracted\") on node \"ip-10-0-130-176.ec2.internal\" DevicePath \"\"" Apr 21 07:54:56.695294 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:56.695026 2567 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/408521c6-dc50-4c6a-b423-65779840ad61-registry-certificates\") on node \"ip-10-0-130-176.ec2.internal\" DevicePath \"\"" Apr 21 
07:54:56.695294 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:56.695035 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v9tp5\" (UniqueName: \"kubernetes.io/projected/408521c6-dc50-4c6a-b423-65779840ad61-kube-api-access-v9tp5\") on node \"ip-10-0-130-176.ec2.internal\" DevicePath \"\"" Apr 21 07:54:56.695294 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:56.695044 2567 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/408521c6-dc50-4c6a-b423-65779840ad61-installation-pull-secrets\") on node \"ip-10-0-130-176.ec2.internal\" DevicePath \"\"" Apr 21 07:54:56.860185 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:56.860087 2567 generic.go:358] "Generic (PLEG): container finished" podID="408521c6-dc50-4c6a-b423-65779840ad61" containerID="f12664297ed92c34e8c162adc3a25f9da5d9b24211422e6218329aa18a3405f2" exitCode=0 Apr 21 07:54:56.860185 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:56.860143 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh" Apr 21 07:54:56.860185 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:56.860167 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh" event={"ID":"408521c6-dc50-4c6a-b423-65779840ad61","Type":"ContainerDied","Data":"f12664297ed92c34e8c162adc3a25f9da5d9b24211422e6218329aa18a3405f2"} Apr 21 07:54:56.860405 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:56.860204 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6466bc5fb5-c9dfh" event={"ID":"408521c6-dc50-4c6a-b423-65779840ad61","Type":"ContainerDied","Data":"c7d55cbc201bb0cf6a74049aa34d210403ff3a0e99e9bd65124aaf70e03ee110"} Apr 21 07:54:56.860405 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:56.860221 2567 scope.go:117] "RemoveContainer" containerID="f12664297ed92c34e8c162adc3a25f9da5d9b24211422e6218329aa18a3405f2" Apr 21 07:54:56.868572 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:56.868537 2567 scope.go:117] "RemoveContainer" containerID="f12664297ed92c34e8c162adc3a25f9da5d9b24211422e6218329aa18a3405f2" Apr 21 07:54:56.868855 ip-10-0-130-176 kubenswrapper[2567]: E0421 07:54:56.868828 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f12664297ed92c34e8c162adc3a25f9da5d9b24211422e6218329aa18a3405f2\": container with ID starting with f12664297ed92c34e8c162adc3a25f9da5d9b24211422e6218329aa18a3405f2 not found: ID does not exist" containerID="f12664297ed92c34e8c162adc3a25f9da5d9b24211422e6218329aa18a3405f2" Apr 21 07:54:56.868962 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:56.868861 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f12664297ed92c34e8c162adc3a25f9da5d9b24211422e6218329aa18a3405f2"} err="failed to get container status 
\"f12664297ed92c34e8c162adc3a25f9da5d9b24211422e6218329aa18a3405f2\": rpc error: code = NotFound desc = could not find container \"f12664297ed92c34e8c162adc3a25f9da5d9b24211422e6218329aa18a3405f2\": container with ID starting with f12664297ed92c34e8c162adc3a25f9da5d9b24211422e6218329aa18a3405f2 not found: ID does not exist" Apr 21 07:54:56.880640 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:56.880615 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6466bc5fb5-c9dfh"] Apr 21 07:54:56.884185 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:56.884160 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6466bc5fb5-c9dfh"] Apr 21 07:54:57.239450 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:54:57.239421 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="408521c6-dc50-4c6a-b423-65779840ad61" path="/var/lib/kubelet/pods/408521c6-dc50-4c6a-b423-65779840ad61/volumes" Apr 21 07:55:02.784871 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:55:02.784829 2567 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-947548f64-g4qpp" podUID="2b660957-c93c-4199-8ce0-edaea697da94" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 21 07:55:02.785420 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:55:02.784893 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-947548f64-g4qpp" Apr 21 07:55:02.785420 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:55:02.785336 2567 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"7f40cee3cd911a9b13774df2634fd55b0cc5469d1e96f518ca516197443290d1"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-947548f64-g4qpp" 
containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 21 07:55:02.785420 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:55:02.785375 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-947548f64-g4qpp" podUID="2b660957-c93c-4199-8ce0-edaea697da94" containerName="service-proxy" containerID="cri-o://7f40cee3cd911a9b13774df2634fd55b0cc5469d1e96f518ca516197443290d1" gracePeriod=30 Apr 21 07:55:03.883771 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:55:03.883737 2567 generic.go:358] "Generic (PLEG): container finished" podID="2b660957-c93c-4199-8ce0-edaea697da94" containerID="7f40cee3cd911a9b13774df2634fd55b0cc5469d1e96f518ca516197443290d1" exitCode=2 Apr 21 07:55:03.884141 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:55:03.883791 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-947548f64-g4qpp" event={"ID":"2b660957-c93c-4199-8ce0-edaea697da94","Type":"ContainerDied","Data":"7f40cee3cd911a9b13774df2634fd55b0cc5469d1e96f518ca516197443290d1"} Apr 21 07:55:03.884141 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:55:03.883817 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-947548f64-g4qpp" event={"ID":"2b660957-c93c-4199-8ce0-edaea697da94","Type":"ContainerStarted","Data":"b6af792d893248c72517546d361110482abc65bc3ac012fdb8fb4feecf05d8e2"} Apr 21 07:55:31.051497 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:55:31.051456 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/292daedb-8f6d-4fbe-b50d-eff99dbdb227-metrics-certs\") pod \"network-metrics-daemon-mbkk9\" (UID: \"292daedb-8f6d-4fbe-b50d-eff99dbdb227\") " pod="openshift-multus/network-metrics-daemon-mbkk9" Apr 21 07:55:31.053789 ip-10-0-130-176 kubenswrapper[2567]: I0421 
07:55:31.053764 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/292daedb-8f6d-4fbe-b50d-eff99dbdb227-metrics-certs\") pod \"network-metrics-daemon-mbkk9\" (UID: \"292daedb-8f6d-4fbe-b50d-eff99dbdb227\") " pod="openshift-multus/network-metrics-daemon-mbkk9" Apr 21 07:55:31.139622 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:55:31.139595 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9jd8h\"" Apr 21 07:55:31.147805 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:55:31.147786 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mbkk9" Apr 21 07:55:31.263531 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:55:31.263502 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mbkk9"] Apr 21 07:55:31.267152 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:55:31.267120 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod292daedb_8f6d_4fbe_b50d_eff99dbdb227.slice/crio-f82c2a7b95d58243d979c6b904d42fb217511668939eeba1a8e853bda486b08c WatchSource:0}: Error finding container f82c2a7b95d58243d979c6b904d42fb217511668939eeba1a8e853bda486b08c: Status 404 returned error can't find the container with id f82c2a7b95d58243d979c6b904d42fb217511668939eeba1a8e853bda486b08c Apr 21 07:55:31.954260 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:55:31.954222 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mbkk9" event={"ID":"292daedb-8f6d-4fbe-b50d-eff99dbdb227","Type":"ContainerStarted","Data":"f82c2a7b95d58243d979c6b904d42fb217511668939eeba1a8e853bda486b08c"} Apr 21 07:55:32.961338 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:55:32.961301 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-mbkk9" event={"ID":"292daedb-8f6d-4fbe-b50d-eff99dbdb227","Type":"ContainerStarted","Data":"22af0da77c38386004c5440608e89c12e3efa63fb6e403bacf7bf6db59f3af26"} Apr 21 07:55:32.961338 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:55:32.961342 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mbkk9" event={"ID":"292daedb-8f6d-4fbe-b50d-eff99dbdb227","Type":"ContainerStarted","Data":"a76b3bfa6cd309e066c5f1cd95a075b788c75006af1635a512aa1d765adae9ef"} Apr 21 07:55:32.980020 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:55:32.979971 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-mbkk9" podStartSLOduration=252.931544671 podStartE2EDuration="4m13.979958756s" podCreationTimestamp="2026-04-21 07:51:19 +0000 UTC" firstStartedPulling="2026-04-21 07:55:31.269335783 +0000 UTC m=+252.667582881" lastFinishedPulling="2026-04-21 07:55:32.317749865 +0000 UTC m=+253.715996966" observedRunningTime="2026-04-21 07:55:32.978742585 +0000 UTC m=+254.376989704" watchObservedRunningTime="2026-04-21 07:55:32.979958756 +0000 UTC m=+254.378205875" Apr 21 07:56:19.126412 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:19.126386 2567 kubelet.go:1628] "Image garbage collection succeeded" Apr 21 07:56:34.476606 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:34.476575 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-9cpch_a349eff5-16ab-4567-b234-b08f49e937a1/global-pull-secret-syncer/0.log" Apr 21 07:56:34.609784 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:34.609755 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-8fs97_14fce2a3-229e-4214-926c-0d2eb411facc/konnectivity-agent/0.log" Apr 21 07:56:34.677543 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:34.677513 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-176.ec2.internal_ff2da2957d53f026a481f44e2475b521/haproxy/0.log" Apr 21 07:56:37.905743 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:37.905713 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4sjs4_1be3fb2a-2904-4061-85f3-4a707df568f5/node-exporter/0.log" Apr 21 07:56:37.925824 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:37.925800 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4sjs4_1be3fb2a-2904-4061-85f3-4a707df568f5/kube-rbac-proxy/0.log" Apr 21 07:56:37.950141 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:37.950117 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4sjs4_1be3fb2a-2904-4061-85f3-4a707df568f5/init-textfile/0.log" Apr 21 07:56:41.202290 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:41.202257 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9fzd8/perf-node-gather-daemonset-wc64n"] Apr 21 07:56:41.202697 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:41.202526 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="408521c6-dc50-4c6a-b423-65779840ad61" containerName="registry" Apr 21 07:56:41.202697 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:41.202538 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="408521c6-dc50-4c6a-b423-65779840ad61" containerName="registry" Apr 21 07:56:41.202697 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:41.202586 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="408521c6-dc50-4c6a-b423-65779840ad61" containerName="registry" Apr 21 07:56:41.205318 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:41.205294 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-wc64n" Apr 21 07:56:41.207728 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:41.207705 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-9fzd8\"/\"kube-root-ca.crt\"" Apr 21 07:56:41.208639 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:41.208621 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-9fzd8\"/\"default-dockercfg-v97gr\"" Apr 21 07:56:41.208639 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:41.208630 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-9fzd8\"/\"openshift-service-ca.crt\"" Apr 21 07:56:41.212874 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:41.212845 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9fzd8/perf-node-gather-daemonset-wc64n"] Apr 21 07:56:41.333645 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:41.333621 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e7052997-428d-48c4-870d-077114242511-proc\") pod \"perf-node-gather-daemonset-wc64n\" (UID: \"e7052997-428d-48c4-870d-077114242511\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-wc64n" Apr 21 07:56:41.333798 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:41.333674 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd5dm\" (UniqueName: \"kubernetes.io/projected/e7052997-428d-48c4-870d-077114242511-kube-api-access-kd5dm\") pod \"perf-node-gather-daemonset-wc64n\" (UID: \"e7052997-428d-48c4-870d-077114242511\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-wc64n" Apr 21 07:56:41.333798 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:41.333745 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e7052997-428d-48c4-870d-077114242511-podres\") pod \"perf-node-gather-daemonset-wc64n\" (UID: \"e7052997-428d-48c4-870d-077114242511\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-wc64n" Apr 21 07:56:41.333906 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:41.333799 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e7052997-428d-48c4-870d-077114242511-sys\") pod \"perf-node-gather-daemonset-wc64n\" (UID: \"e7052997-428d-48c4-870d-077114242511\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-wc64n" Apr 21 07:56:41.333906 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:41.333830 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e7052997-428d-48c4-870d-077114242511-lib-modules\") pod \"perf-node-gather-daemonset-wc64n\" (UID: \"e7052997-428d-48c4-870d-077114242511\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-wc64n" Apr 21 07:56:41.429767 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:41.429722 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-gt2km_051ee8de-c3b9-4235-b367-e1804c2a570a/dns/0.log" Apr 21 07:56:41.434530 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:41.434511 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e7052997-428d-48c4-870d-077114242511-podres\") pod \"perf-node-gather-daemonset-wc64n\" (UID: \"e7052997-428d-48c4-870d-077114242511\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-wc64n" Apr 21 07:56:41.434641 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:41.434543 2567 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e7052997-428d-48c4-870d-077114242511-sys\") pod \"perf-node-gather-daemonset-wc64n\" (UID: \"e7052997-428d-48c4-870d-077114242511\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-wc64n" Apr 21 07:56:41.434641 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:41.434563 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e7052997-428d-48c4-870d-077114242511-lib-modules\") pod \"perf-node-gather-daemonset-wc64n\" (UID: \"e7052997-428d-48c4-870d-077114242511\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-wc64n" Apr 21 07:56:41.434740 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:41.434642 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e7052997-428d-48c4-870d-077114242511-sys\") pod \"perf-node-gather-daemonset-wc64n\" (UID: \"e7052997-428d-48c4-870d-077114242511\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-wc64n" Apr 21 07:56:41.434740 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:41.434686 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e7052997-428d-48c4-870d-077114242511-podres\") pod \"perf-node-gather-daemonset-wc64n\" (UID: \"e7052997-428d-48c4-870d-077114242511\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-wc64n" Apr 21 07:56:41.434740 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:41.434687 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e7052997-428d-48c4-870d-077114242511-proc\") pod \"perf-node-gather-daemonset-wc64n\" (UID: \"e7052997-428d-48c4-870d-077114242511\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-wc64n" Apr 21 07:56:41.434740 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:41.434732 
2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e7052997-428d-48c4-870d-077114242511-proc\") pod \"perf-node-gather-daemonset-wc64n\" (UID: \"e7052997-428d-48c4-870d-077114242511\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-wc64n" Apr 21 07:56:41.434888 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:41.434747 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kd5dm\" (UniqueName: \"kubernetes.io/projected/e7052997-428d-48c4-870d-077114242511-kube-api-access-kd5dm\") pod \"perf-node-gather-daemonset-wc64n\" (UID: \"e7052997-428d-48c4-870d-077114242511\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-wc64n" Apr 21 07:56:41.434888 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:41.434822 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e7052997-428d-48c4-870d-077114242511-lib-modules\") pod \"perf-node-gather-daemonset-wc64n\" (UID: \"e7052997-428d-48c4-870d-077114242511\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-wc64n" Apr 21 07:56:41.442025 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:41.442002 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd5dm\" (UniqueName: \"kubernetes.io/projected/e7052997-428d-48c4-870d-077114242511-kube-api-access-kd5dm\") pod \"perf-node-gather-daemonset-wc64n\" (UID: \"e7052997-428d-48c4-870d-077114242511\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-wc64n" Apr 21 07:56:41.446279 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:41.446260 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-gt2km_051ee8de-c3b9-4235-b367-e1804c2a570a/kube-rbac-proxy/0.log" Apr 21 07:56:41.498741 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:41.498715 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_node-resolver-f8v2g_4d8fb59f-57b4-44fc-bfd7-f9714c35d8be/dns-node-resolver/0.log" Apr 21 07:56:41.515593 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:41.515570 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-wc64n" Apr 21 07:56:41.624805 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:41.624716 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9fzd8/perf-node-gather-daemonset-wc64n"] Apr 21 07:56:41.627238 ip-10-0-130-176 kubenswrapper[2567]: W0421 07:56:41.627207 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode7052997_428d_48c4_870d_077114242511.slice/crio-9903de215b5566d49dd728908f26f02009eafdf31ebefd6483164d0a7da4f57b WatchSource:0}: Error finding container 9903de215b5566d49dd728908f26f02009eafdf31ebefd6483164d0a7da4f57b: Status 404 returned error can't find the container with id 9903de215b5566d49dd728908f26f02009eafdf31ebefd6483164d0a7da4f57b Apr 21 07:56:41.628918 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:41.628898 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 07:56:41.936765 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:41.936668 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-pn27z_fbed9d63-ec12-483e-ba8d-a4082bbfd141/node-ca/0.log" Apr 21 07:56:42.140709 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:42.140675 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-wc64n" event={"ID":"e7052997-428d-48c4-870d-077114242511","Type":"ContainerStarted","Data":"e1de0a41763372e4d7dec1ab413fda649406ed55e39fba77e92a734a2280a813"} Apr 21 07:56:42.140709 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:42.140709 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-wc64n" event={"ID":"e7052997-428d-48c4-870d-077114242511","Type":"ContainerStarted","Data":"9903de215b5566d49dd728908f26f02009eafdf31ebefd6483164d0a7da4f57b"} Apr 21 07:56:42.140955 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:42.140804 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-wc64n" Apr 21 07:56:42.156522 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:42.156473 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-wc64n" podStartSLOduration=1.156459236 podStartE2EDuration="1.156459236s" podCreationTimestamp="2026-04-21 07:56:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:56:42.155616558 +0000 UTC m=+323.553863678" watchObservedRunningTime="2026-04-21 07:56:42.156459236 +0000 UTC m=+323.554706355" Apr 21 07:56:42.832806 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:42.832774 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-5krbh_ccaea438-6e90-4c2d-ae4a-b0e2d90f1eb8/serve-healthcheck-canary/0.log" Apr 21 07:56:43.196037 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:43.195936 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8v6xm_0c48b594-686f-497a-b780-493e11888b34/kube-rbac-proxy/0.log" Apr 21 07:56:43.213551 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:43.213524 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8v6xm_0c48b594-686f-497a-b780-493e11888b34/exporter/0.log" Apr 21 07:56:43.239173 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:43.239146 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-8v6xm_0c48b594-686f-497a-b780-493e11888b34/extractor/0.log"
Apr 21 07:56:48.152090 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:48.152057 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-wc64n"
Apr 21 07:56:48.186943 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:48.186920 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z2d8g_7eac19c8-be6a-49df-a01f-690587797f2d/kube-multus-additional-cni-plugins/0.log"
Apr 21 07:56:48.204982 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:48.204955 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z2d8g_7eac19c8-be6a-49df-a01f-690587797f2d/egress-router-binary-copy/0.log"
Apr 21 07:56:48.222195 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:48.222172 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z2d8g_7eac19c8-be6a-49df-a01f-690587797f2d/cni-plugins/0.log"
Apr 21 07:56:48.241060 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:48.241036 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z2d8g_7eac19c8-be6a-49df-a01f-690587797f2d/bond-cni-plugin/0.log"
Apr 21 07:56:48.257623 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:48.257584 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z2d8g_7eac19c8-be6a-49df-a01f-690587797f2d/routeoverride-cni/0.log"
Apr 21 07:56:48.274332 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:48.274311 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z2d8g_7eac19c8-be6a-49df-a01f-690587797f2d/whereabouts-cni-bincopy/0.log"
Apr 21 07:56:48.292214 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:48.292198 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z2d8g_7eac19c8-be6a-49df-a01f-690587797f2d/whereabouts-cni/0.log"
Apr 21 07:56:48.316499 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:48.316471 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d2hxj_910435a2-053a-4a3e-9020-156057e0c177/kube-multus/0.log"
Apr 21 07:56:48.452914 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:48.452835 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-mbkk9_292daedb-8f6d-4fbe-b50d-eff99dbdb227/network-metrics-daemon/0.log"
Apr 21 07:56:48.468832 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:48.468810 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-mbkk9_292daedb-8f6d-4fbe-b50d-eff99dbdb227/kube-rbac-proxy/0.log"
Apr 21 07:56:49.064126 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:49.064094 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2phll_a8a61d55-c981-4fda-bb59-0fc4d138d739/ovn-controller/0.log"
Apr 21 07:56:49.082663 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:49.082615 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2phll_a8a61d55-c981-4fda-bb59-0fc4d138d739/ovn-acl-logging/0.log"
Apr 21 07:56:49.101923 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:49.101898 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2phll_a8a61d55-c981-4fda-bb59-0fc4d138d739/kube-rbac-proxy-node/0.log"
Apr 21 07:56:49.120145 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:49.120125 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2phll_a8a61d55-c981-4fda-bb59-0fc4d138d739/kube-rbac-proxy-ovn-metrics/0.log"
Apr 21 07:56:49.134107 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:49.134078 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2phll_a8a61d55-c981-4fda-bb59-0fc4d138d739/northd/0.log"
Apr 21 07:56:49.153768 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:49.153733 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2phll_a8a61d55-c981-4fda-bb59-0fc4d138d739/nbdb/0.log"
Apr 21 07:56:49.171370 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:49.171328 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2phll_a8a61d55-c981-4fda-bb59-0fc4d138d739/sbdb/0.log"
Apr 21 07:56:49.313447 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:49.313416 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2phll_a8a61d55-c981-4fda-bb59-0fc4d138d739/ovnkube-controller/0.log"
Apr 21 07:56:50.882924 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:50.882858 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-9d664_b183c600-7bbc-4275-b1b6-1a71e7b6cc15/network-check-target-container/0.log"
Apr 21 07:56:51.757393 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:51.757355 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-vrkff_ee4e1b33-295c-4557-9f5f-2cc029155627/iptables-alerter/0.log"
Apr 21 07:56:52.422556 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:52.422455 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-jjpf8_77cd248d-7f69-4be8-a1e1-3df94ad81274/tuned/0.log"
Apr 21 07:56:55.317797 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:55.317765 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-z86cz_79b97bca-1c70-43d9-b07b-3b0ac8671a20/csi-driver/0.log"
Apr 21 07:56:55.336398 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:55.336361 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-z86cz_79b97bca-1c70-43d9-b07b-3b0ac8671a20/csi-node-driver-registrar/0.log"
Apr 21 07:56:55.353061 ip-10-0-130-176 kubenswrapper[2567]: I0421 07:56:55.353041 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-z86cz_79b97bca-1c70-43d9-b07b-3b0ac8671a20/csi-liveness-probe/0.log"